Mar 10 00:06:24 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 00:06:24 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:24 crc restorecon[4706]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc 
restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc 
restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:24 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 
crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:25 crc restorecon[4706]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 
crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc 
restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:25 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 00:06:26 crc kubenswrapper[4994]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.274747 4994 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280228 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280266 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280288 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280300 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280312 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280323 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280336 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280346 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280356 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280369 4994 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280381 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280392 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280401 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280410 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280418 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280426 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280434 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280442 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280451 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280460 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280497 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280509 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280517 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280528 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280536 4994 feature_gate.go:330] unrecognized feature gate: Example Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280544 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280552 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280559 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280567 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280575 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280582 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280590 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280597 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280605 4994 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280612 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280620 4994 feature_gate.go:330] 
unrecognized feature gate: PlatformOperators Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280629 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280637 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280646 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280653 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280661 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280669 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280677 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280685 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280692 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280700 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280708 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280717 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280725 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280733 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 
00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280740 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280748 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280755 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280763 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280770 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280778 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280786 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280794 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280801 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280811 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280818 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280826 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280834 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280841 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 
00:06:26.280849 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280858 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280865 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280910 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280920 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280929 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.280938 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282633 4994 flags.go:64] FLAG: --address="0.0.0.0" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282658 4994 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282678 4994 flags.go:64] FLAG: --anonymous-auth="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282689 4994 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282700 4994 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282710 4994 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282722 4994 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282734 4994 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 
00:06:26.282743 4994 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282752 4994 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282761 4994 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282772 4994 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282782 4994 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282790 4994 flags.go:64] FLAG: --cgroup-root="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282799 4994 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282808 4994 flags.go:64] FLAG: --client-ca-file="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282816 4994 flags.go:64] FLAG: --cloud-config="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282825 4994 flags.go:64] FLAG: --cloud-provider="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282833 4994 flags.go:64] FLAG: --cluster-dns="[]" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282843 4994 flags.go:64] FLAG: --cluster-domain="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282852 4994 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282861 4994 flags.go:64] FLAG: --config-dir="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282907 4994 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282920 4994 flags.go:64] FLAG: --container-log-max-files="5" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282931 4994 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 10 00:06:26 crc kubenswrapper[4994]: 
I0310 00:06:26.282941 4994 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282951 4994 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282961 4994 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282970 4994 flags.go:64] FLAG: --contention-profiling="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282979 4994 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.282990 4994 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283002 4994 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283010 4994 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283021 4994 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283030 4994 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283039 4994 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283047 4994 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283056 4994 flags.go:64] FLAG: --enable-server="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283065 4994 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283077 4994 flags.go:64] FLAG: --event-burst="100" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283086 4994 flags.go:64] FLAG: --event-qps="50" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283095 4994 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 
00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283104 4994 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283113 4994 flags.go:64] FLAG: --eviction-hard="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283124 4994 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283133 4994 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283142 4994 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283151 4994 flags.go:64] FLAG: --eviction-soft="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283160 4994 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283169 4994 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283178 4994 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283186 4994 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283195 4994 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283204 4994 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283213 4994 flags.go:64] FLAG: --feature-gates="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283223 4994 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283232 4994 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283241 4994 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283250 4994 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283259 4994 flags.go:64] FLAG: --healthz-port="10248" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283268 4994 flags.go:64] FLAG: --help="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283276 4994 flags.go:64] FLAG: --hostname-override="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283286 4994 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283301 4994 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283313 4994 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283324 4994 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283336 4994 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283348 4994 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283357 4994 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283365 4994 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283375 4994 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283384 4994 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283393 4994 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283402 4994 flags.go:64] FLAG: --kube-reserved="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283410 4994 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283420 4994 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283429 4994 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283438 4994 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283446 4994 flags.go:64] FLAG: --lock-file="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283455 4994 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283464 4994 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283472 4994 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283486 4994 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283495 4994 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283503 4994 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283512 4994 flags.go:64] FLAG: --logging-format="text" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283522 4994 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283532 4994 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283540 4994 flags.go:64] FLAG: --manifest-url="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283549 4994 flags.go:64] FLAG: --manifest-url-header="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283561 4994 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283570 4994 flags.go:64] FLAG: --max-open-files="1000000" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283580 4994 
flags.go:64] FLAG: --max-pods="110" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283589 4994 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283598 4994 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283608 4994 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283617 4994 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283626 4994 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283635 4994 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283643 4994 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283665 4994 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283674 4994 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283685 4994 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283695 4994 flags.go:64] FLAG: --pod-cidr="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283703 4994 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283716 4994 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283725 4994 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283734 4994 flags.go:64] FLAG: --pods-per-core="0" Mar 10 00:06:26 
crc kubenswrapper[4994]: I0310 00:06:26.283744 4994 flags.go:64] FLAG: --port="10250" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283753 4994 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283761 4994 flags.go:64] FLAG: --provider-id="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283770 4994 flags.go:64] FLAG: --qos-reserved="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283779 4994 flags.go:64] FLAG: --read-only-port="10255" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283788 4994 flags.go:64] FLAG: --register-node="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283797 4994 flags.go:64] FLAG: --register-schedulable="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283815 4994 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283830 4994 flags.go:64] FLAG: --registry-burst="10" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283839 4994 flags.go:64] FLAG: --registry-qps="5" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283848 4994 flags.go:64] FLAG: --reserved-cpus="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283857 4994 flags.go:64] FLAG: --reserved-memory="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283868 4994 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283906 4994 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283916 4994 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283925 4994 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283935 4994 flags.go:64] FLAG: --runonce="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283943 4994 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283954 4994 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283963 4994 flags.go:64] FLAG: --seccomp-default="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283973 4994 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283982 4994 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.283992 4994 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284001 4994 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284010 4994 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284019 4994 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284029 4994 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284037 4994 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284046 4994 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284056 4994 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284067 4994 flags.go:64] FLAG: --system-cgroups="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284076 4994 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284090 4994 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284100 4994 flags.go:64] FLAG: --tls-cert-file="" Mar 10 
00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284109 4994 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284119 4994 flags.go:64] FLAG: --tls-min-version="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284129 4994 flags.go:64] FLAG: --tls-private-key-file="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284137 4994 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284146 4994 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284155 4994 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284164 4994 flags.go:64] FLAG: --v="2" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284176 4994 flags.go:64] FLAG: --version="false" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284187 4994 flags.go:64] FLAG: --vmodule="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284198 4994 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.284208 4994 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284428 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284441 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284449 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284458 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284497 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 00:06:26 crc 
kubenswrapper[4994]: W0310 00:06:26.284506 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284514 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284522 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284530 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284537 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284545 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284553 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284563 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284574 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284582 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284590 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284599 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284606 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284614 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284621 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284629 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284638 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284648 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284658 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284667 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284675 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284684 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284692 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284699 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284708 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284716 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284724 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284731 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284739 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284749 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284758 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284766 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284777 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284788 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284802 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284812 4994 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284824 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284834 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284843 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284853 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284863 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284904 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284916 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284926 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284936 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284945 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284955 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284965 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284975 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284985 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.284995 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285004 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285015 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285025 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285036 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285046 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285057 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285067 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285078 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285088 4994 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285097 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285106 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285114 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285121 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285129 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.285137 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.285995 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.300560 4994 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.300982 4994 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301135 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301151 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301161 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301171 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301181 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301193 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301203 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301212 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301221 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301230 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301240 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301249 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301258 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301267 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301276 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301285 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301293 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301302 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301313 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301327 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301338 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301348 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301357 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301367 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301376 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301384 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301395 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301405 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301415 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301424 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301433 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301441 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301454 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301464 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301473 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301481 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301490 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301498 4994 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301511 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301522 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301535 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301544 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301554 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301563 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301572 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301581 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301590 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301599 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301608 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301616 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301624 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301633 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301641 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301650 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301658 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301666 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301675 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301683 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301692 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301701 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301710 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301719 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301727 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301736 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301745 4994 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301754 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301762 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301771 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301780 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301788 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.301800 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.301816 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302161 4994 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302178 4994 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302188 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302197 4994 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302206 4994 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302214 4994 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302224 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302232 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302241 4994 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302249 4994 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302258 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302267 4994 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302276 4994 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302284 4994 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302293 4994 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302301 4994 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302310 4994 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302318 4994 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302327 4994 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302336 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302344 4994 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302353 4994 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302361 4994 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302370 4994 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302378 4994 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302386 4994 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302395 4994 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302404 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302414 4994 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302435 4994 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302444 4994 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302454 4994 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302463 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302472 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302483 4994 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302493 4994 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302502 4994 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302512 4994 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302520 4994 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302529 4994 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302538 4994 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302546 4994 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302557 4994 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302565 4994 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302577 4994 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302587 4994 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302595 4994 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302604 4994 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302612 4994 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302621 4994 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302629 4994 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302637 4994 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302645 4994 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302654 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302662 4994 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302671 4994 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302679 4994 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302687 4994 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302696 4994 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302705 4994 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302713 4994 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302721 4994 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302730 4994 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302742 4994 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302752 4994 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302761 4994 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302771 4994 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302781 4994 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302790 4994 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302799 4994 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.302807 4994 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.302821 4994 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.303905 4994 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.308347 4994 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.315306 4994 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.315483 4994 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.319936 4994 server.go:997] "Starting client certificate rotation"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.319986 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.321034 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.346617 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.349600 4994 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.349704 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.373284 4994 log.go:25] "Validated CRI v1 runtime API"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.415018 4994 log.go:25] "Validated CRI v1 image API"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.417705 4994 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.423548 4994 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-00-01-09-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.423605 4994 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.456632 4994 manager.go:217] Machine: {Timestamp:2026-03-10 00:06:26.453614174 +0000 UTC m=+0.627320973 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c9a6b1d9-12bb-4e1d-8072-25b4f73868f8 BootID:9894519f-677e-4b1e-80a1-f7e7d58a0619 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:98:67:e2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:98:67:e2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4f:02:a8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b1:e2:a6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:87:4f:5b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:70:d0:db Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f2:2f:57:75:3c:d9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:40:c2:aa:a8:34 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457098 4994 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457298 4994 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.457943 4994 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458262 4994 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458333 4994 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458668 4994 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.458687 4994 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.459378 4994 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.459431 4994 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.460203 4994 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.460359 4994 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464439 4994 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464479 4994 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464522 4994 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464545 4994 kubelet.go:324] "Adding apiserver pod source"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.464563 4994 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.468954 4994 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.469574 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused
Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.469681 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.470115 4994 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.470350 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.470482 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.472581 4994 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474515 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474558 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474574 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474589 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474612 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 
00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474625 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474639 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474660 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474676 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474691 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474710 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.474724 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.477073 4994 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.477788 4994 server.go:1280] "Started kubelet" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.479177 4994 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.479133 4994 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 00:06:26 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.480038 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.480736 4994 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482203 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482292 4994 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482606 4994 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.482639 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482653 4994 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.482669 4994 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.483431 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.483523 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483590 4994 server.go:460] "Adding debug handlers to kubelet server" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483926 4994 factory.go:55] Registering systemd factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.483969 4994 factory.go:221] Registration of the systemd container factory successfully Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.484299 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484348 4994 factory.go:153] Registering CRI-O factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484414 4994 factory.go:221] Registration of the crio container factory successfully Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484529 4994 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484568 4994 factory.go:103] Registering Raw factory Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.484599 4994 manager.go:1196] Started watching for new ooms in manager Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.485830 4994 manager.go:319] Starting recovery of all containers Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.486184 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.164:6443: connect: connection refused" 
event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503634 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503729 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503762 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503792 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503819 4994 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503844 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503868 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503935 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.503964 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504002 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504025 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504050 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504083 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504119 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504143 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504181 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504208 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504234 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504260 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504287 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504318 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504350 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504406 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504433 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504458 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504483 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504583 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504614 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504638 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504674 4994 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504702 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504742 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504766 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504792 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504818 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504844 4994 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504924 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504957 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.504985 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505013 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505039 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505067 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505093 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505118 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505146 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505171 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505201 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505232 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505260 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505287 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505315 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505340 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505374 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505400 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505429 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505457 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505483 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505510 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505535 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505562 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 
00:06:26.505588 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505700 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505732 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505776 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.505804 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507520 4994 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507592 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507630 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507660 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507692 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507744 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507774 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507805 4994 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507833 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507861 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507929 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507961 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.507990 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508018 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508087 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508116 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508139 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508164 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508192 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508222 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508249 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508277 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508305 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508330 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508394 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508425 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508452 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508479 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508504 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508530 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508558 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508587 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508615 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508643 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508668 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508694 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508722 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508749 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508794 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508823 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508864 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508931 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508969 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.508999 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509029 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509060 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509094 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509122 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509148 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509179 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509206 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509231 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509259 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509284 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509309 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509334 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509362 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509394 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509420 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509446 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509473 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509500 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509534 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509565 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509592 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509621 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509662 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509692 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" 
seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509719 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509748 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509776 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509805 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509831 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509859 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 
00:06:26.509931 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509961 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.509989 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510020 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510050 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510074 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510104 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510131 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510156 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510185 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510210 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510237 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510264 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510289 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510315 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510344 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510369 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510396 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510435 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510464 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510493 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510519 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510544 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510563 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510582 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" 
seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510601 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510620 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510643 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510662 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510680 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510698 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510753 4994 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510778 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510796 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510816 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510835 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510861 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510926 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510952 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510972 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.510993 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511013 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511032 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511052 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511079 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511105 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511130 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511155 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511176 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511196 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511218 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511238 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511259 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511279 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511302 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511327 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511354 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511380 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511400 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511419 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511443 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511471 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511498 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511527 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511552 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511574 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511594 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511614 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511635 4994 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511654 4994 reconstruct.go:97] "Volume reconstruction finished" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.511667 4994 reconciler.go:26] "Reconciler: start to sync state" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.520414 4994 manager.go:324] Recovery completed Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.538090 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540727 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.540800 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542280 4994 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542311 4994 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.542347 4994 state_mem.go:36] "Initialized new in-memory state store" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.549550 4994 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552620 4994 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552679 4994 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.552712 4994 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.552783 4994 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 00:06:26 crc kubenswrapper[4994]: W0310 00:06:26.555464 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.555558 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.565110 4994 policy_none.go:49] "None policy: Start" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.566551 4994 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.566599 4994 state_mem.go:35] "Initializing new in-memory state store" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.583393 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628174 4994 manager.go:334] 
"Starting Device Plugin manager" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628246 4994 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628268 4994 server.go:79] "Starting device plugin registration server" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628834 4994 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.628854 4994 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629190 4994 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629305 4994 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.629319 4994 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.640664 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.652921 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.653048 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.654716 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.655082 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.655140 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658668 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658818 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658925 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.658940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659692 4994 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659910 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.659980 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.660998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661210 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661240 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661462 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.661541 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662686 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662791 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662837 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662687 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.662942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664174 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664220 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664557 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.664615 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665904 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.665961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.685865 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715451 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715557 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715596 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715631 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715674 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715734 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715809 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715913 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715953 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.715997 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716055 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716084 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.716142 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.728993 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730240 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730298 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.730357 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.731026 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" 
node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817501 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817630 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817657 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817696 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817726 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817757 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817765 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817819 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817826 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817856 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817934 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817948 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817911 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817977 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817996 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.817973 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818017 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818022 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc 
kubenswrapper[4994]: I0310 00:06:26.818046 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818023 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818071 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818086 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818052 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818134 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818195 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.818269 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.931323 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.932963 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.933042 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: E0310 00:06:26.933528 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:26 crc kubenswrapper[4994]: I0310 00:06:26.999557 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.018225 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.032186 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.057739 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.059306 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4 WatchSource:0}: Error finding container 9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4: Status 404 returned error can't find the container with id 9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4 Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.061245 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893 WatchSource:0}: Error finding container 6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893: Status 404 returned error can't find the container with id 6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893 Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.069446 4994 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.079939 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee WatchSource:0}: Error finding container 6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee: Status 404 returned error can't find the container with id 6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.086994 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.088032 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd WatchSource:0}: Error finding container 28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd: Status 404 returned error can't find the container with id 28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.303000 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.303087 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.334048 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335616 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335652 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.335686 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.336058 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.481142 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.510051 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: 
E0310 00:06:27.510123 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.558513 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9ccfe79d4dfe6da753fc95167b67ebc134a794926f1a11929e3a732af141d1a4"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.559502 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28b6d77d1900211df3b731b1343563247180870acb2c2899bfb179c7793a50bd"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.560574 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.561605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c3df1cdfdff2ff0cff1b5236b34ebc262f2d8ab395878986e7ee06a83ed10c0c"} Mar 10 00:06:27 crc kubenswrapper[4994]: I0310 00:06:27.562816 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6858f9b978a8d205cbc7547b7638cc779b2799ed14a8bc8a7d70dff467b6f893"} Mar 10 00:06:27 crc 
kubenswrapper[4994]: W0310 00:06:27.700498 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.700637 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: W0310 00:06:27.803447 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.803562 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:27 crc kubenswrapper[4994]: E0310 00:06:27.888150 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.136863 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138645 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138715 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.138776 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:28 crc kubenswrapper[4994]: E0310 00:06:28.139298 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.433287 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:28 crc kubenswrapper[4994]: E0310 00:06:28.434949 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.481219 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.568933 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.569068 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.569121 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572070 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572130 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572140 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572164 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572175 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.572195 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574769 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574834 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.574994 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576099 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576166 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576746 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576832 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.576988 4994 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578040 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578967 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.578979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579153 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.579178 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581248 4994 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99" exitCode=0 Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581306 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99"} Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.581396 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583379 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583431 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:28 crc kubenswrapper[4994]: I0310 00:06:28.583449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: W0310 00:06:29.243049 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.243375 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:29 crc kubenswrapper[4994]: W0310 00:06:29.391481 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.391557 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" 
logger="UnhandledError" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.480767 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.489514 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587214 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587272 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587291 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.587326 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.588435 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:29 crc 
kubenswrapper[4994]: I0310 00:06:29.588472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.588485 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.592491 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.592606 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.593612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596425 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596458 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c"} Mar 10 00:06:29 crc 
kubenswrapper[4994]: I0310 00:06:29.596477 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.596505 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598241 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba" exitCode=0 Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598290 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.598420 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599420 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.599464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.600478 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573"} Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.600566 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.601439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.739615 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741057 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.741136 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:29 crc kubenswrapper[4994]: E0310 00:06:29.741603 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 10 00:06:29 crc 
kubenswrapper[4994]: I0310 00:06:29.799164 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:29 crc kubenswrapper[4994]: I0310 00:06:29.809434 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607060 4994 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010" exitCode=0 Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010"} Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.607240 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608863 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.608909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612674 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13"} Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612753 
4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612808 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612924 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.612976 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.613113 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614902 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.614916 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615083 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4994]: I0310 00:06:30.615101 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621777 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621827 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621833 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621843 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7"} Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621927 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621937 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621955 4994 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176"} Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.621984 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637"} Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623198 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623217 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623583 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623633 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623748 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:31 crc kubenswrapper[4994]: I0310 00:06:31.623785 4994 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.110524 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.509736 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630273 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9"} Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630343 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04"} Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630373 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630406 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.630443 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632161 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632179 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.632492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.821507 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.942005 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:32 crc kubenswrapper[4994]: I0310 00:06:32.944579 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633000 4994 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633069 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.633015 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634829 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.634930 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.635008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.785036 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.785323 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.786973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.787030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.787052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4994]: I0310 00:06:33.837275 4994 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.635686 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636847 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636957 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:34 crc kubenswrapper[4994]: I0310 00:06:34.636984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.389939 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.390200 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.391949 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.925810 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.926107 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929689 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:35 crc kubenswrapper[4994]: I0310 00:06:35.929782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.292915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.293137 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294812 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294867 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:36 crc kubenswrapper[4994]: I0310 00:06:36.294923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:36 crc kubenswrapper[4994]: E0310 00:06:36.640851 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:38 crc kubenswrapper[4994]: I0310 00:06:38.389863 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:38 crc 
kubenswrapper[4994]: I0310 00:06:38.390005 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.242716 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.242925 4994 trace.go:236] Trace[475414176]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 00:06:30.241) (total time: 10001ms): Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[475414176]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:40.242) Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[475414176]: [10.001821248s] [10.001821248s] END Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.242970 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.327276 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.327374 
4994 trace.go:236] Trace[984513433]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 00:06:30.326) (total time: 10001ms): Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[984513433]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:40.327) Mar 10 00:06:40 crc kubenswrapper[4994]: Trace[984513433]: [10.001139388s] [10.001139388s] END Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.327397 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.481612 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.826542 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.827463 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 
2026-02-23T05:33:13Z Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.827528 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.831459 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:40 crc kubenswrapper[4994]: W0310 00:06:40.835311 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.835404 4994 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.838095 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.843157 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.843251 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.848830 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:40 crc kubenswrapper[4994]: I0310 00:06:40.848910 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 00:06:40 crc kubenswrapper[4994]: E0310 00:06:40.858836 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.485355 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:41Z is after 2026-02-23T05:33:13Z Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.658016 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.660782 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13" exitCode=255 Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.660850 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13"} Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.661111 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662365 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.662384 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:41 crc kubenswrapper[4994]: I0310 00:06:41.663188 4994 scope.go:117] "RemoveContainer" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.180328 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.180601 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.181823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.237211 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-crc" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.486462 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:42Z is after 2026-02-23T05:33:13Z Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.666566 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.668691 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.669336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086"} Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.669440 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670258 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670295 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.670988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.671010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.671019 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.684948 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 00:06:42 crc kubenswrapper[4994]: I0310 00:06:42.829916 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.486567 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:43Z is after 2026-02-23T05:33:13Z Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.678069 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.678736 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.681597 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" exitCode=255 Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.681851 4994 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682067 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086"} Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682195 4994 scope.go:117] "RemoveContainer" containerID="a172cab5caa680bc0a998d64ce1c15a99e8b1d8705c147b732f9861c19ad8d13" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.682282 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683614 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683770 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683806 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683775 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683964 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.683994 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.685112 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:43 crc kubenswrapper[4994]: E0310 00:06:43.685432 4994 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.687017 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.791502 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.791706 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.793320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:43 crc kubenswrapper[4994]: I0310 00:06:43.838159 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.486069 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:44Z is 
after 2026-02-23T05:33:13Z Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.689555 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.693936 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695022 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695069 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:44 crc kubenswrapper[4994]: I0310 00:06:44.695802 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:44 crc kubenswrapper[4994]: E0310 00:06:44.696104 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:45 crc kubenswrapper[4994]: W0310 00:06:45.009068 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 
2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.009187 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.486317 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: W0310 00:06:45.671800 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.671925 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.697182 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:45 
crc kubenswrapper[4994]: I0310 00:06:45.698598 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.698650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.698669 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:45 crc kubenswrapper[4994]: I0310 00:06:45.699576 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:45 crc kubenswrapper[4994]: E0310 00:06:45.699864 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:46 crc kubenswrapper[4994]: I0310 00:06:46.484639 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:46Z is after 2026-02-23T05:33:13Z Mar 10 00:06:46 crc kubenswrapper[4994]: E0310 00:06:46.641147 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:47 crc kubenswrapper[4994]: E0310 00:06:47.232471 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.238607 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240528 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240548 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.240581 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:47 crc kubenswrapper[4994]: E0310 00:06:47.245613 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:47 crc kubenswrapper[4994]: I0310 00:06:47.484674 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:47Z is after 2026-02-23T05:33:13Z Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.391114 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.391770 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.486077 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:48Z is after 2026-02-23T05:33:13Z Mar 10 00:06:48 crc kubenswrapper[4994]: I0310 00:06:48.863409 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:48 crc kubenswrapper[4994]: E0310 00:06:48.869386 4994 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:49 crc kubenswrapper[4994]: I0310 00:06:49.486547 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:49Z is after 2026-02-23T05:33:13Z Mar 10 00:06:50 crc kubenswrapper[4994]: I0310 00:06:50.485433 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:50Z is after 2026-02-23T05:33:13Z Mar 10 00:06:50 crc kubenswrapper[4994]: E0310 00:06:50.837070 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:51 crc kubenswrapper[4994]: W0310 00:06:51.198981 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: E0310 00:06:51.199082 4994 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:51 crc kubenswrapper[4994]: I0310 00:06:51.485353 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: W0310 00:06:51.814407 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z Mar 10 00:06:51 crc kubenswrapper[4994]: E0310 00:06:51.814501 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:52 crc kubenswrapper[4994]: I0310 00:06:52.486142 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:52Z is after 2026-02-23T05:33:13Z Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.271076 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.271361 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273155 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.273176 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.274013 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.485650 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:53Z is after 2026-02-23T05:33:13Z Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.722634 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.725541 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"} Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.725713 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.726515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:53 crc kubenswrapper[4994]: I0310 00:06:53.837575 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.237588 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.245756 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 
00:06:54.247235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.247276 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.252240 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:54 crc kubenswrapper[4994]: W0310 00:06:54.456152 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.456245 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.486308 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:54Z is after 2026-02-23T05:33:13Z Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.730132 4994 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.731070 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733830 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" exitCode=255 Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733925 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118"} Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.733998 4994 scope.go:117] "RemoveContainer" containerID="f52a5461d5c275703eae1c974fba29c6d8fd43dca97c8db12da3652511c2a086" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.734110 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735523 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.735588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:54 crc kubenswrapper[4994]: I0310 00:06:54.736657 4994 scope.go:117] "RemoveContainer" 
containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:06:54 crc kubenswrapper[4994]: E0310 00:06:54.737009 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.485755 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.739577 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.742284 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.743487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:55 crc kubenswrapper[4994]: I0310 00:06:55.744263 4994 scope.go:117] "RemoveContainer" 
containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:06:55 crc kubenswrapper[4994]: E0310 00:06:55.744572 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:55 crc kubenswrapper[4994]: W0310 00:06:55.878337 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z Mar 10 00:06:55 crc kubenswrapper[4994]: E0310 00:06:55.878442 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:56 crc kubenswrapper[4994]: I0310 00:06:56.485542 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:56Z is after 2026-02-23T05:33:13Z Mar 10 00:06:56 crc kubenswrapper[4994]: E0310 00:06:56.641397 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Mar 10 00:06:57 crc kubenswrapper[4994]: I0310 00:06:57.485616 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:57Z is after 2026-02-23T05:33:13Z Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391233 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391324 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391404 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.391607 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.393836 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.394120 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1" gracePeriod=30 Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.484814 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 2026-02-23T05:33:13Z Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.753433 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.754585 4994 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1" exitCode=255 Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 
00:06:58.754801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1"} Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.755025 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406"} Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.755277 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756535 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:58 crc kubenswrapper[4994]: I0310 00:06:58.756603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.487700 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.758054 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.759369 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 
00:06:59.759428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:59 crc kubenswrapper[4994]: I0310 00:06:59.759447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:00 crc kubenswrapper[4994]: I0310 00:07:00.489099 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.846924 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522282d4604f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,LastTimestamp:2026-03-10 00:06:26.477744207 +0000 UTC m=+0.651450996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.854538 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.860857 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.867966 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC 
m=+0.714517697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.875713 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228c094ff4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.632208372 +0000 UTC m=+0.805915161,LastTimestamp:2026-03-10 00:06:26.632208372 +0000 UTC m=+0.805915161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.883910 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.654443816 +0000 UTC m=+0.828150585,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc 
kubenswrapper[4994]: E0310 00:07:00.887490 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.654464804 +0000 UTC m=+0.828171563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.891251 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.654476809 +0000 UTC m=+0.828183568,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.893930 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.658586995 +0000 UTC m=+0.832293784,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.897992 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.658682372 +0000 UTC m=+0.832389161,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.903482 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.658701019 +0000 UTC m=+0.832407808,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.910150 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.658915849 +0000 UTC m=+0.832622608,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.916864 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.658935186 +0000 UTC m=+0.832641945,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.923561 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.65894662 +0000 UTC m=+0.832653379,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.930389 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.661032109 +0000 UTC m=+0.834738898,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.937402 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.661053297 +0000 UTC m=+0.834760086,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.943926 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC 
m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.661070794 +0000 UTC m=+0.834777583,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.950417 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.661201743 +0000 UTC m=+0.834908512,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.957765 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.66122043 +0000 UTC m=+0.834927189,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.964756 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.661235215 +0000 UTC m=+0.834941974,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.971387 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.662481142 +0000 UTC m=+0.836187931,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.975635 4994 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.66250433 +0000 UTC m=+0.836211119,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.980716 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228696b29c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228696b29c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540810908 +0000 UTC m=+0.714517697,LastTimestamp:2026-03-10 00:06:26.662521377 +0000 UTC m=+0.836228166,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.987672 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b52228695e9c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b52228695e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540759488 +0000 UTC m=+0.714466267,LastTimestamp:2026-03-10 00:06:26.662678796 +0000 UTC m=+0.836385595,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.989264 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b522286966f97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b522286966f97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:26.540793751 +0000 UTC m=+0.714500530,LastTimestamp:2026-03-10 00:06:26.662910512 +0000 UTC m=+0.836617291,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:00 crc kubenswrapper[4994]: E0310 00:07:00.994866 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222a603760a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.068032522 +0000 UTC m=+1.241739311,LastTimestamp:2026-03-10 00:06:27.068032522 +0000 UTC m=+1.241739311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.000659 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222a603d8ab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.068057771 +0000 UTC m=+1.241764560,LastTimestamp:2026-03-10 00:06:27.068057771 +0000 UTC m=+1.241764560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: 
E0310 00:07:01.006553 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222a64045bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.072017852 +0000 UTC m=+1.245724641,LastTimestamp:2026-03-10 00:06:27.072017852 +0000 UTC m=+1.245724641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.012060 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222a70de7d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.085494231 +0000 UTC 
m=+1.259201020,LastTimestamp:2026-03-10 00:06:27.085494231 +0000 UTC m=+1.259201020,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.015950 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222a7dfcb60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.099249504 +0000 UTC m=+1.272956283,LastTimestamp:2026-03-10 00:06:27.099249504 +0000 UTC m=+1.272956283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.021742 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cc3c24aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.70928145 +0000 UTC m=+1.882988209,LastTimestamp:2026-03-10 00:06:27.70928145 +0000 UTC m=+1.882988209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.027583 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222cc67177a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.712096122 +0000 UTC m=+1.885802912,LastTimestamp:2026-03-10 00:06:27.712096122 +0000 UTC m=+1.885802912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.033209 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222cc6e0379 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.712549753 +0000 UTC m=+1.886256512,LastTimestamp:2026-03-10 00:06:27.712549753 +0000 UTC m=+1.886256512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.038995 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222ccbbcd0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.717647628 +0000 UTC m=+1.891354387,LastTimestamp:2026-03-10 00:06:27.717647628 +0000 UTC m=+1.891354387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.042945 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222ccea8601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.720709633 +0000 UTC m=+1.894416422,LastTimestamp:2026-03-10 00:06:27.720709633 +0000 UTC m=+1.894416422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.049711 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cd081626 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,LastTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.056412 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222cd30e8d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.725322448 +0000 UTC m=+1.899029207,LastTimestamp:2026-03-10 00:06:27.725322448 +0000 UTC m=+1.899029207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.062845 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222cd4b2d2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.727043882 +0000 UTC m=+1.900750671,LastTimestamp:2026-03-10 00:06:27.727043882 +0000 UTC m=+1.900750671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.069773 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5222cddeabe4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.736710116 +0000 UTC m=+1.910416875,LastTimestamp:2026-03-10 00:06:27.736710116 +0000 UTC m=+1.910416875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.079771 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5222cdea73bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.737482175 +0000 UTC m=+1.911188934,LastTimestamp:2026-03-10 00:06:27.737482175 +0000 UTC m=+1.911188934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.086203 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5222ce00bdb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.738942901 +0000 UTC m=+1.912649680,LastTimestamp:2026-03-10 00:06:27.738942901 +0000 UTC m=+1.912649680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.093380 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e15a96d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.063598295 +0000 UTC m=+2.237305074,LastTimestamp:2026-03-10 00:06:28.063598295 
+0000 UTC m=+2.237305074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.097316 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e2468471 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,LastTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.102751 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e25d37a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.080547748 +0000 UTC m=+2.254254537,LastTimestamp:2026-03-10 00:06:28.080547748 +0000 UTC m=+2.254254537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.110197 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f1876177 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.334969207 +0000 UTC m=+2.508675986,LastTimestamp:2026-03-10 00:06:28.334969207 +0000 UTC m=+2.508675986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.116697 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f24ff63a openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.34811449 +0000 UTC m=+2.521821279,LastTimestamp:2026-03-10 00:06:28.34811449 +0000 UTC m=+2.521821279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.123035 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222f267e56a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.34968305 +0000 UTC m=+2.523389839,LastTimestamp:2026-03-10 00:06:28.34968305 +0000 UTC m=+2.523389839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 
00:07:01.130727 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5222ffc90f4c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.574154572 +0000 UTC m=+2.747861351,LastTimestamp:2026-03-10 00:06:28.574154572 +0000 UTC m=+2.747861351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.139464 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522300021f94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.577894292 +0000 UTC m=+2.751601051,LastTimestamp:2026-03-10 00:06:28.577894292 +0000 UTC m=+2.751601051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.148300 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223003d0b7b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.581755771 +0000 UTC m=+2.755462560,LastTimestamp:2026-03-10 00:06:28.581755771 +0000 UTC m=+2.755462560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.156719 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5223008e75ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.587091434 +0000 UTC m=+2.760798223,LastTimestamp:2026-03-10 00:06:28.587091434 +0000 UTC m=+2.760798223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.163181 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52230214c216 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.612669974 +0000 UTC m=+2.786376763,LastTimestamp:2026-03-10 00:06:28.612669974 +0000 UTC m=+2.786376763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.170107 4994 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b522303db9a7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.642478718 +0000 UTC m=+2.816185517,LastTimestamp:2026-03-10 00:06:28.642478718 +0000 UTC m=+2.816185517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.176725 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522310bf5509 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.858729737 +0000 UTC m=+3.032436496,LastTimestamp:2026-03-10 00:06:28.858729737 +0000 UTC m=+3.032436496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.183120 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522310dca24c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.86065006 +0000 UTC m=+3.034356819,LastTimestamp:2026-03-10 00:06:28.86065006 +0000 UTC m=+3.034356819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.192015 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522310dd5941 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.860696897 +0000 UTC m=+3.034403656,LastTimestamp:2026-03-10 00:06:28.860696897 +0000 UTC m=+3.034403656,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.200126 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223119d75cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.873287117 +0000 UTC m=+3.046993876,LastTimestamp:2026-03-10 00:06:28.873287117 +0000 UTC m=+3.046993876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.201542 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522311bd3374 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.875367284 +0000 UTC m=+3.049074043,LastTimestamp:2026-03-10 00:06:28.875367284 +0000 UTC m=+3.049074043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.207810 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b522311cab733 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.876252979 +0000 UTC m=+3.049959738,LastTimestamp:2026-03-10 00:06:28.876252979 +0000 UTC m=+3.049959738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.214224 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522311d61298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.876997272 +0000 UTC m=+3.050704031,LastTimestamp:2026-03-10 00:06:28.876997272 +0000 UTC m=+3.050704031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.218333 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522311d94bf4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.877208564 +0000 UTC m=+3.050915323,LastTimestamp:2026-03-10 00:06:28.877208564 +0000 UTC m=+3.050915323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.224661 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522311e25e56 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.877803094 +0000 UTC m=+3.051509853,LastTimestamp:2026-03-10 00:06:28.877803094 +0000 UTC m=+3.051509853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.230538 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b522313030083 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.896718979 +0000 UTC m=+3.070425758,LastTimestamp:2026-03-10 00:06:28.896718979 +0000 UTC m=+3.070425758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.237077 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52231fb28882 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.109549186 +0000 UTC m=+3.283255945,LastTimestamp:2026-03-10 00:06:29.109549186 +0000 UTC m=+3.283255945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.246972 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52231fb617ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.109782511 +0000 UTC m=+3.283489260,LastTimestamp:2026-03-10 00:06:29.109782511 +0000 UTC m=+3.283489260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 
00:07:01.247602 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.252786 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.253071 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522320903add openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.124078301 +0000 UTC m=+3.297785050,LastTimestamp:2026-03-10 00:06:29.124078301 +0000 UTC m=+3.297785050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255424 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255442 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.255478 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.259766 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.259916 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223209f18ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.12505259 +0000 UTC m=+3.298759339,LastTimestamp:2026-03-10 00:06:29.12505259 +0000 UTC m=+3.298759339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.262989 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522320dd5485 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.129131141 +0000 UTC m=+3.302837900,LastTimestamp:2026-03-10 00:06:29.129131141 +0000 UTC m=+3.302837900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.266251 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522320fe4fff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.131292671 +0000 UTC m=+3.304999420,LastTimestamp:2026-03-10 00:06:29.131292671 +0000 UTC m=+3.304999420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.270494 4994 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232d23d6fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.33507865 +0000 UTC m=+3.508785409,LastTimestamp:2026-03-10 00:06:29.33507865 +0000 UTC m=+3.508785409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.272825 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52232d4b61e3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.337670115 +0000 UTC m=+3.511376874,LastTimestamp:2026-03-10 00:06:29.337670115 +0000 UTC m=+3.511376874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.279094 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232e8d0c80 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.358750848 +0000 UTC m=+3.532457607,LastTimestamp:2026-03-10 00:06:29.358750848 +0000 UTC m=+3.532457607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.285851 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52232e9fe449 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.359985737 +0000 UTC m=+3.533692526,LastTimestamp:2026-03-10 00:06:29.359985737 +0000 UTC m=+3.533692526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.292480 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b52232ebd1038 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.361897528 +0000 UTC m=+3.535604287,LastTimestamp:2026-03-10 00:06:29.361897528 +0000 UTC m=+3.535604287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.299120 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233a447c01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.555321857 +0000 UTC m=+3.729028616,LastTimestamp:2026-03-10 00:06:29.555321857 +0000 UTC m=+3.729028616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.306945 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b2da79b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.570602907 +0000 UTC m=+3.744309676,LastTimestamp:2026-03-10 00:06:29.570602907 +0000 UTC m=+3.744309676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.312346 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b432679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,LastTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.320483 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52233cf916d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.600712406 +0000 UTC m=+3.774419165,LastTimestamp:2026-03-10 00:06:29.600712406 +0000 UTC m=+3.774419165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.327266 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52234826974b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,LastTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.334079 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522348fe1ae8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.80236772 +0000 UTC m=+3.976074469,LastTimestamp:2026-03-10 
00:06:29.80236772 +0000 UTC m=+3.976074469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.340573 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52234a5c7918 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.825329432 +0000 UTC m=+3.999036201,LastTimestamp:2026-03-10 00:06:29.825329432 +0000 UTC m=+3.999036201,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.346759 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52234b121bb4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.837233076 +0000 UTC m=+4.010939845,LastTimestamp:2026-03-10 00:06:29.837233076 +0000 UTC 
m=+4.010939845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.353436 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223792fbca8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.61092676 +0000 UTC m=+4.784633539,LastTimestamp:2026-03-10 00:06:30.61092676 +0000 UTC m=+4.784633539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.361571 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522389f6a077 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.892396663 +0000 UTC 
m=+5.066103442,LastTimestamp:2026-03-10 00:06:30.892396663 +0000 UTC m=+5.066103442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.367799 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52238ab6e4b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.904997043 +0000 UTC m=+5.078703832,LastTimestamp:2026-03-10 00:06:30.904997043 +0000 UTC m=+5.078703832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.373578 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52238accbb20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:30.906428192 +0000 UTC m=+5.080134971,LastTimestamp:2026-03-10 00:06:30.906428192 +0000 UTC m=+5.080134971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.378386 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223998e4ff5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.153995765 +0000 UTC m=+5.327702544,LastTimestamp:2026-03-10 00:06:31.153995765 +0000 UTC m=+5.327702544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.384709 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52239a7b3519 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.169520921 +0000 UTC 
m=+5.343227710,LastTimestamp:2026-03-10 00:06:31.169520921 +0000 UTC m=+5.343227710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.391335 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52239a9691ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.171314175 +0000 UTC m=+5.345020954,LastTimestamp:2026-03-10 00:06:31.171314175 +0000 UTC m=+5.345020954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.397315 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223aa5669d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.435545048 +0000 UTC m=+5.609251827,LastTimestamp:2026-03-10 00:06:31.435545048 +0000 UTC m=+5.609251827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.404186 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223ab4d1e63 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.451713123 +0000 UTC m=+5.625419912,LastTimestamp:2026-03-10 00:06:31.451713123 +0000 UTC m=+5.625419912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.410344 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223ab61328d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.453029005 +0000 UTC m=+5.626735784,LastTimestamp:2026-03-10 00:06:31.453029005 +0000 UTC m=+5.626735784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.417707 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bb98ae27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.725100583 +0000 UTC m=+5.898807372,LastTimestamp:2026-03-10 00:06:31.725100583 +0000 UTC m=+5.898807372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.423939 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bc7d2dc0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.740075456 +0000 UTC m=+5.913782235,LastTimestamp:2026-03-10 00:06:31.740075456 +0000 UTC m=+5.913782235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.430115 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223bc90306f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.741321327 +0000 UTC m=+5.915028116,LastTimestamp:2026-03-10 00:06:31.741321327 +0000 UTC m=+5.915028116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.436523 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223cbc513ac openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:31.996445612 +0000 UTC m=+6.170152361,LastTimestamp:2026-03-10 00:06:31.996445612 +0000 UTC m=+6.170152361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.443149 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5223cc711d09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:32.007720201 +0000 UTC m=+6.181426940,LastTimestamp:2026-03-10 00:06:32.007720201 +0000 UTC m=+6.181426940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.453084 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b522548da78f9 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:38.389967097 +0000 UTC m=+12.563673886,LastTimestamp:2026-03-10 00:06:38.389967097 +0000 UTC m=+12.563673886,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.459749 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b522548dbd921 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:38.390057249 +0000 UTC m=+12.563764028,LastTimestamp:2026-03-10 00:06:38.390057249 +0000 UTC 
m=+12.563764028,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.466451 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-apiserver-crc.189b5225db1412d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:07:01 crc kubenswrapper[4994]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:07:01 crc kubenswrapper[4994]: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,LastTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.473065 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5225db152a55 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,LastTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.479312 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5225db1412d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-apiserver-crc.189b5225db1412d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:07:01 crc kubenswrapper[4994]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:07:01 crc kubenswrapper[4994]: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843215573 +0000 UTC m=+15.016922332,LastTimestamp:2026-03-10 00:06:40.848893498 +0000 UTC 
m=+15.022600257,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: I0310 00:07:01.486182 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.486282 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5225db152a55\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5225db152a55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:40.843287125 +0000 UTC m=+15.016993894,LastTimestamp:2026-03-10 00:06:40.84893639 +0000 UTC m=+15.022643149,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.489719 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b52233b432679\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52233b432679 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.572011641 +0000 UTC m=+3.745718430,LastTimestamp:2026-03-10 00:06:41.664645497 +0000 UTC m=+15.838352286,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.493859 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b52234826974b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52234826974b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.788243787 +0000 UTC m=+3.961950536,LastTimestamp:2026-03-10 00:06:41.960231794 +0000 UTC m=+16.133938563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.495998 4994 event.go:359] "Server rejected 
event (will not retry!)" err="events \"kube-apiserver-crc.189b522348fe1ae8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522348fe1ae8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.80236772 +0000 UTC m=+3.976074469,LastTimestamp:2026-03-10 00:06:41.972024858 +0000 UTC m=+16.145731617,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.500989 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.504972 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.508662 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d00e9f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:01 crc kubenswrapper[4994]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:01 crc kubenswrapper[4994]: body: Mar 10 00:07:01 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:06:58.391302984 +0000 UTC m=+32.565009773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:01 crc kubenswrapper[4994]: > Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.515137 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d02b035\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:06:58.391362966 +0000 UTC m=+32.565069745,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.519411 4994 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5229f1313a33 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:58.394094131 +0000 UTC m=+32.567800910,LastTimestamp:2026-03-10 00:06:58.394094131 +0000 UTC m=+32.567800910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.521663 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5222cd081626\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222cd081626 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:27.722647078 +0000 UTC m=+1.896353837,LastTimestamp:2026-03-10 00:06:58.516042376 +0000 UTC m=+32.689749125,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.527022 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5222e15a96d7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e15a96d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.063598295 +0000 UTC m=+2.237305074,LastTimestamp:2026-03-10 00:06:58.688794086 +0000 UTC m=+32.862500835,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:01 crc kubenswrapper[4994]: E0310 00:07:01.532517 4994 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189b5222e2468471\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5222e2468471 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.079060081 +0000 UTC m=+2.252766860,LastTimestamp:2026-03-10 00:06:58.700492656 +0000 UTC m=+32.874199405,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:02 crc kubenswrapper[4994]: I0310 00:07:02.483151 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.270146 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.270446 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272429 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.272572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.273556 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:07:03 crc kubenswrapper[4994]: E0310 00:07:03.274028 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:03 crc kubenswrapper[4994]: I0310 00:07:03.487009 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:04 crc kubenswrapper[4994]: I0310 00:07:04.487723 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.390215 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.390491 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392058 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392112 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.392132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.488438 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.600856 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:07:05 crc kubenswrapper[4994]: I0310 00:07:05.621597 4994 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.293261 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.293525 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295314 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.295364 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:06 crc kubenswrapper[4994]: W0310 00:07:06.301395 4994 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 00:07:06 crc kubenswrapper[4994]: E0310 00:07:06.301463 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:06 crc kubenswrapper[4994]: I0310 00:07:06.487806 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:06 crc kubenswrapper[4994]: E0310 00:07:06.641675 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:07 crc kubenswrapper[4994]: I0310 00:07:07.488494 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:07 crc kubenswrapper[4994]: W0310 00:07:07.589442 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 00:07:07 crc kubenswrapper[4994]: E0310 00:07:07.589525 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group 
\"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.255682 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.260845 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262664 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262724 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.262783 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.269147 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.391124 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.391207 4994 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.398096 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d00e9f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:07:08 crc kubenswrapper[4994]: &Event{ObjectMeta:{kube-controller-manager-crc.189b52279d00e9f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:07:08 crc kubenswrapper[4994]: body: Mar 10 00:07:08 crc kubenswrapper[4994]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.39170712 +0000 UTC m=+22.565413909,LastTimestamp:2026-03-10 00:07:08.391186487 +0000 UTC m=+42.564893276,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:07:08 crc kubenswrapper[4994]: > Mar 10 00:07:08 crc kubenswrapper[4994]: E0310 00:07:08.405940 4994 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52279d02b035\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52279d02b035 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:48.391823413 +0000 UTC m=+22.565530202,LastTimestamp:2026-03-10 00:07:08.391239678 +0000 UTC m=+42.564946457,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:07:08 crc kubenswrapper[4994]: I0310 00:07:08.489552 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:09 crc kubenswrapper[4994]: I0310 00:07:09.482739 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:09 crc kubenswrapper[4994]: W0310 00:07:09.798972 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 00:07:09 crc kubenswrapper[4994]: E0310 00:07:09.799053 
4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:10 crc kubenswrapper[4994]: I0310 00:07:10.487562 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:10 crc kubenswrapper[4994]: W0310 00:07:10.743495 4994 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:10 crc kubenswrapper[4994]: E0310 00:07:10.743581 4994 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:11 crc kubenswrapper[4994]: I0310 00:07:11.483704 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:12 crc kubenswrapper[4994]: I0310 00:07:12.483960 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:13 crc kubenswrapper[4994]: I0310 00:07:13.488082 4994 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.487987 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.553767 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556113 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.556206 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:14 crc kubenswrapper[4994]: I0310 00:07:14.557125 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.262505 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.269592 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270914 4994 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.270979 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.277315 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.397553 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.397703 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398722 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.398731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.401775 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.487222 4994 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.804609 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.805416 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808270 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" exitCode=255 Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808370 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808343 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4"} Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808446 4994 scope.go:117] "RemoveContainer" containerID="ecf22b989e157fbfad852193a09a519c3d53cc5918f2a245644553e08338d118" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.808675 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809297 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.809308 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810283 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810319 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:15 crc kubenswrapper[4994]: I0310 00:07:15.810665 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:15 crc kubenswrapper[4994]: E0310 00:07:15.810800 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:16 crc kubenswrapper[4994]: I0310 00:07:16.486655 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:16 crc kubenswrapper[4994]: E0310 00:07:16.641791 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:16 crc kubenswrapper[4994]: I0310 00:07:16.812975 
4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.183971 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.184256 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185784 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.185913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:17 crc kubenswrapper[4994]: I0310 00:07:17.487625 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:18 crc kubenswrapper[4994]: I0310 00:07:18.487055 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:19 crc kubenswrapper[4994]: I0310 00:07:19.493192 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:20 crc 
kubenswrapper[4994]: I0310 00:07:20.487478 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:21 crc kubenswrapper[4994]: I0310 00:07:21.487755 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:22 crc kubenswrapper[4994]: E0310 00:07:22.270253 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.277625 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.279283 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:22 crc kubenswrapper[4994]: E0310 00:07:22.286050 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" 
node="crc" Mar 10 00:07:22 crc kubenswrapper[4994]: I0310 00:07:22.490776 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.271128 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.271628 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273326 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.273785 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.274916 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:23 crc kubenswrapper[4994]: E0310 00:07:23.275320 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.488083 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.837599 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.837817 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.839202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:23 crc kubenswrapper[4994]: I0310 00:07:23.840095 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:23 crc kubenswrapper[4994]: E0310 00:07:23.840373 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:24 crc kubenswrapper[4994]: I0310 00:07:24.487418 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:25 crc 
kubenswrapper[4994]: I0310 00:07:25.487940 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:26 crc kubenswrapper[4994]: I0310 00:07:26.487329 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:26 crc kubenswrapper[4994]: E0310 00:07:26.642200 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:27 crc kubenswrapper[4994]: I0310 00:07:27.487438 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:28 crc kubenswrapper[4994]: I0310 00:07:28.489182 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:29 crc kubenswrapper[4994]: E0310 00:07:29.277357 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.286535 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.288927 4994 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.288987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.289002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.289030 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:29 crc kubenswrapper[4994]: E0310 00:07:29.296168 4994 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:29 crc kubenswrapper[4994]: I0310 00:07:29.487665 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.488175 4994 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.963851 4994 csr.go:261] certificate signing request csr-dn7g8 is approved, waiting to be issued Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.973076 4994 csr.go:257] certificate signing request csr-dn7g8 is issued Mar 10 00:07:30 crc kubenswrapper[4994]: I0310 00:07:30.992035 4994 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.320842 4994 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.974528 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-27 19:39:16.68495999 +0000 UTC Mar 10 00:07:31 crc kubenswrapper[4994]: I0310 00:07:31.974588 4994 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7027h31m44.710377243s for next certificate rotation Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.297145 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298644 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298667 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.298860 4994 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.312709 4994 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.313197 4994 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.313257 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318575 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 
00:07:36.318615 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.318654 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.334803 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343476 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343494 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.343536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.357716 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365267 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.365280 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.377825 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386006 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4994]: I0310 00:07:36.386112 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401356 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401611 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.401652 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.502324 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.603468 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.642448 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.704134 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.805230 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:36 crc kubenswrapper[4994]: E0310 00:07:36.906258 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.007355 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.108240 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: 
E0310 00:07:37.209054 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.310079 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.410474 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.511612 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.612499 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.712845 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.813967 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:37 crc kubenswrapper[4994]: E0310 00:07:37.914837 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.015851 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.117194 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.218413 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.319526 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.421321 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.521492 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.622117 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.722278 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.823419 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:38 crc kubenswrapper[4994]: E0310 00:07:38.923945 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.024458 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.125102 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.226227 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.326391 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.426528 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.527701 4994 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.553602 4994 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.555277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4994]: I0310 00:07:39.556186 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.556497 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.628585 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.729098 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.829486 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:39 crc kubenswrapper[4994]: E0310 00:07:39.930534 4994 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.031916 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.133147 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.233656 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.334735 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.436221 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.537432 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.637983 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.738501 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.839070 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:40 crc kubenswrapper[4994]: I0310 00:07:40.904788 4994 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:40 crc kubenswrapper[4994]: E0310 00:07:40.940066 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.040480 4994 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.141260 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.242606 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.342957 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.443109 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.544092 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.644783 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.745163 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.846104 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:41 crc kubenswrapper[4994]: E0310 00:07:41.946280 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.046833 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.147120 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc 
kubenswrapper[4994]: E0310 00:07:42.247383 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.348489 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.449794 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.550646 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.651325 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.752009 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.852365 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:42 crc kubenswrapper[4994]: E0310 00:07:42.952677 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.053244 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.154284 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.254683 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.355794 4994 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.455974 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.556291 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.657148 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.757591 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.857999 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:43 crc kubenswrapper[4994]: E0310 00:07:43.959119 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.060417 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.160609 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.261523 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.362011 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.463218 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.563631 4994 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.664602 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.765320 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.865487 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:44 crc kubenswrapper[4994]: E0310 00:07:44.966361 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.066499 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.167495 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.268803 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.369668 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.471378 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.571727 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.671934 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 
00:07:45.773113 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.874282 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:45 crc kubenswrapper[4994]: E0310 00:07:45.974405 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.075184 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.175377 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.276054 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.376552 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.476642 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.517090 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522597 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522682 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.522702 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.539473 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545637 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.545659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.546158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.546226 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.563319 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568901 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.568980 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.569391 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.569443 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.585034 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590548 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590581 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:46 crc kubenswrapper[4994]: I0310 00:07:46.590606 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605475 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605691 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.605724 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.643539 4994 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.706107 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.807216 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:46 crc kubenswrapper[4994]: E0310 00:07:46.908131 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.008718 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.109605 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.210707 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.311741 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.412046 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.512924 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: E0310 00:07:47.613634 4994 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.655237 4994 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.717695 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820801 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.820959 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924124 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924228 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:47 crc kubenswrapper[4994]: I0310 00:07:47.924247 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.028460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.028859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.029401 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.132949 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236607 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.236904 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340291 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.340428 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443916 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443946 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.443958 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.513967 4994 apiserver.go:52] "Watching apiserver" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.521395 4994 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522056 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jhp6z","openshift-multus/multus-mcxcb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn","openshift-dns/node-resolver-24l69","openshift-machine-config-operator/machine-config-daemon-kfljj","openshift-multus/multus-additional-cni-plugins-b2f6h","openshift-multus/network-metrics-daemon-vxjt2","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-node-ns797","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522535 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.522775 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523200 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.523239 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523264 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.523673 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.523751 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524275 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.524511 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524555 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.525464 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524417 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.525368 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.526066 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.524628 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.526996 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.527967 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537141 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537240 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537494 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.537977 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538165 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538267 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538415 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538477 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.538922 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.539484 4994 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542127 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542168 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542445 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542630 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542707 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542765 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542973 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543026 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543067 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 
00:07:48.543308 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543327 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543587 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543593 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.543913 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544013 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.542466 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544267 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544317 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544483 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.544709 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545018 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545655 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545773 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.545132 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547202 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547272 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.547330 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.570077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.584353 4994 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.584364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599376 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599417 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599444 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599466 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599487 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599529 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599549 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599573 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599594 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599615 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599654 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599673 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599714 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599737 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599841 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599868 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599918 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599939 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599962 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.599982 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600006 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600027 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600068 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600092 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600111 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: 
I0310 00:07:48.600132 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600154 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600176 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600198 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600219 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600244 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600267 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600311 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600333 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600355 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 
00:07:48.600375 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600416 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600438 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600458 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600478 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600522 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600542 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600562 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600583 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600606 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600627 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600650 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600672 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600712 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600732 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600775 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600795 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600818 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600862 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600911 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600933 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600954 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600976 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601004 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.600994 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601028 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601140 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601202 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601359 4994 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601731 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601798 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601851 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601938 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.601991 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.602189 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.102025767 +0000 UTC m=+83.275732616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602257 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602318 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602359 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.602740 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603064 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603341 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603764 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604123 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604209 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.604844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603369 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603361 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605319 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605359 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605283 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.605708 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606038 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606470 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.606703 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607544 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607567 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.607939 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608662 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608722 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608751 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608766 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.608823 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609046 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609068 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.609410 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610340 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610730 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610767 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.610983 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611542 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611556 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611605 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611619 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.603517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611744 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611925 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.611963 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612012 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612061 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612166 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612232 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612277 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612387 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612470 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612533 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612567 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612600 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612674 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612708 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612742 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 
00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612777 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612867 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612992 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613031 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613113 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613214 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613253 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613330 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613367 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613402 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613437 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613473 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613510 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613547 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613583 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613622 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613657 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613694 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613728 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613800 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613838 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613914 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613953 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614031 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614069 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614108 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614220 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614275 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614366 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614407 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614443 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612067 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612064 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612239 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612293 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612312 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612432 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612755 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.612955 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613467 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613679 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.613757 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614190 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614198 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615753 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.615776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616400 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616476 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.616855 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617191 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617766 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.617807 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.618414 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.618714 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.614539 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619101 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619152 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619193 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.619231 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619309 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619350 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619380 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619418 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619485 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619515 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619544 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619572 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619601 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619632 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619693 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619725 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619757 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619790 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619826 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619857 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619918 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619959 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619991 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.620020 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620055 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620088 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620129 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620161 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620190 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620223 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620284 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620327 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620364 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620405 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620467 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620501 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620538 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620570 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620603 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620635 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620665 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620698 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620769 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.620800 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620915 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620953 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620985 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621017 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621052 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621158 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621349 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621381 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621411 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621480 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621556 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621691 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621800 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621893 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621929 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622016 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622051 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622089 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r5sl\" (UniqueName: \"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622118 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622209 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622240 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622306 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622352 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622865 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623955 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624008 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624421 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624479 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624522 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624599 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624630 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624668 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624703 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.624746 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624783 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624821 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624853 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624934 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624974 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625023 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625045 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625070 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625093 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625136 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625181 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625203 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625224 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625271 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625294 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625315 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625336 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625358 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625378 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625420 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625441 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625466 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625490 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625511 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625807 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.625848 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625904 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625933 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626058 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626259 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626471 4994 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626499 4994 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626519 4994 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626574 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626599 4994 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628631 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629566 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619270 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619454 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619675 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.619642 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620299 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.620934 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.621847 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622666 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.622705 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623405 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623760 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630306 4994 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630360 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630446 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.630495 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631045 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631122 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631188 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.631378 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632349 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.632344 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634176 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623845 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623940 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624059 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624085 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624336 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624514 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624555 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624601 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.624651 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625222 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625248 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625517 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.625660 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626182 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626246 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.626533 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.626841 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628257 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628557 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628563 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640406 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.627849 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.628682 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629052 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629104 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629125 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.629580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634582 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.634795 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635061 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635157 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635612 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.635857 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.636409 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.637259 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.637280 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638015 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638016 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638087 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638246 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638301 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638575 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638599 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638700 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638802 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.638864 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639030 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639560 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639594 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639695 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639804 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639952 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639979 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639971 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640049 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640129 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640211 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640607 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.640691 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641051 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641465 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641505 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.641171 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.623779 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.639410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.644974 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645695 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.645944 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646274 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646327 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646458 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646552 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.646580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647012 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.146989137 +0000 UTC m=+83.320695896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647180 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.647308 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.147292217 +0000 UTC m=+83.320998966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648292 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648385 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648455 4994 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648516 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648574 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648634 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648692 4994 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648749 4994 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648807 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648861 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.648952 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649008 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649060 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649115 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649178 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649233 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649292 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649345 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649400 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649457 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649508 4994 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649570 4994 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649631 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649688 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.649744 4994 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650567 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650616 4994 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650639 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650666 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650703 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650839 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.650966 4994 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651006 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651025 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651042 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651060 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651084 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651102 4994 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651213 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651293 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651327 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651345 4994 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651362 4994 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651383 4994 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651482 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651501 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651519 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651543 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651558 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651571 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651584 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651600 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651612 4994 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651624 
4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651650 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651663 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651676 4994 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651689 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.651705 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655111 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655423 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655439 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655746 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.655774 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.656179 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.657431 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.658037 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.659888 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.660169 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.661580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.662062 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664101 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664579 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.664740 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664860 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664926 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.664943 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.665036 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.16501547 +0000 UTC m=+83.338722239 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.665221 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670715 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670761 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.670943 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.670717 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.671058 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.171020835 +0000 UTC m=+83.344727624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671116 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.671141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.674673 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.674737 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.675505 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.684511 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.685257 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.685646 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.694587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.696029 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.705652 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.716272 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.728843 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.737232 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.746823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753320 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753428 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753459 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753527 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753554 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753597 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753607 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753727 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/194b252b-4eca-42f4-85e1-5c51a42eb407-hosts-file\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753764 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753783 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753800 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753831 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753848 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753885 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753902 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753918 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753936 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753952 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753969 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753996 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754014 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754030 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754045 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754063 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754081 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754085 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-host\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754126 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754161 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cnibin\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.753766 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754187 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-system-cni-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754227 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754340 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754504 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ced5d66d-39df-4267-b801-e1e60d517ace-rootfs\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.754521 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-socket-dir-parent\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754553 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754599 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754655 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-kubelet\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754649 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754695 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754728 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754735 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754745 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754942 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-system-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754972 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754992 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-bin\") pod 
\"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.754998 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755051 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-cni-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755090 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755121 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.755136 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755174 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755183 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-hostroot\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755183 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755219 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755296 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r5sl\" (UniqueName: \"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755317 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: E0310 00:07:48.755366 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:49.255342935 +0000 UTC m=+83.429049694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755443 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755523 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755634 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755707 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755850 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-var-lib-cni-multus\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.755930 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.756471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ced5d66d-39df-4267-b801-e1e60d517ace-mcd-auth-proxy-config\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.757665 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.757734 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758009 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758101 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " 
pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758224 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758456 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758477 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-env-overrides\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.758508 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-etc-kubernetes\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758664 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758678 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-conf-dir\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758808 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758927 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.758981 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759013 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759045 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759165 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759191 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-netns\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759207 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759224 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759246 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-cnibin\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759293 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-os-release\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759496 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-cni-binary-copy\") pod \"multus-mcxcb\" (UID: 
\"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.759554 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760349 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760373 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-k8s-cni-cncf-io\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760406 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760422 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-host-run-multus-certs\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc 
kubenswrapper[4994]: I0310 00:07:48.760473 4994 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760485 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760494 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760505 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760517 4994 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760526 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760536 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760545 4994 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760555 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760565 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760576 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760585 4994 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760595 4994 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760604 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760614 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760624 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760633 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760641 4994 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760650 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760660 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760669 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760677 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760686 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760694 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760703 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760711 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760720 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760730 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760739 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760747 
4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760756 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760765 4994 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760774 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760781 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760791 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760799 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760807 4994 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760815 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760824 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760832 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760840 4994 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760849 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760858 4994 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760884 4994 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 
crc kubenswrapper[4994]: I0310 00:07:48.760893 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760902 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760910 4994 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760918 4994 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760927 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760936 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760945 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760953 4994 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760962 4994 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760972 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760981 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.760991 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761001 4994 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761010 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761020 4994 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761030 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761038 4994 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761047 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761055 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761064 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761072 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761080 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761089 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761099 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761108 4994 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761117 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761126 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761134 4994 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761142 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761151 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761160 4994 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761169 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761178 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761187 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761196 4994 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761204 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761214 4994 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761223 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761231 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761239 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761248 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761256 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761265 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761273 4994 reconciler_common.go:293] "Volume detached for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761282 4994 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761291 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761300 4994 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761309 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761318 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761326 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761335 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761343 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761351 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761360 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761370 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761379 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761386 4994 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761395 4994 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761404 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761412 4994 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761420 4994 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761428 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761437 4994 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761445 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761454 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761462 
4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761470 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761479 4994 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761488 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761497 4994 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761505 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761513 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761521 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761533 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761541 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761549 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761557 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761565 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761573 4994 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761582 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761590 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761597 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761606 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761614 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.761623 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.762383 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6dac87a5-07eb-488d-85fe-cb8848434ae5-multus-daemon-config\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.762735 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ced5d66d-39df-4267-b801-e1e60d517ace-proxy-tls\") pod \"machine-config-daemon-kfljj\" (UID: 
\"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.763746 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-serviceca\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.766596 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.771298 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.771417 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6dac87a5-07eb-488d-85fe-cb8848434ae5-os-release\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.780247 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.782722 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkj5\" (UniqueName: \"kubernetes.io/projected/7c9fd1f0-58d6-4986-86b5-8c26c871e79b-kube-api-access-hwkj5\") pod \"node-ca-jhp6z\" (UID: \"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\") " pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.784448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsnl\" (UniqueName: \"kubernetes.io/projected/cd1d8032-7c65-474f-9a19-a93bf0cac8ba-kube-api-access-kfsnl\") pod \"ovnkube-control-plane-749d76644c-d28jn\" (UID: \"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.785694 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lnm\" (UniqueName: \"kubernetes.io/projected/6dac87a5-07eb-488d-85fe-cb8848434ae5-kube-api-access-k4lnm\") pod \"multus-mcxcb\" (UID: \"6dac87a5-07eb-488d-85fe-cb8848434ae5\") " pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.785975 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r5sl\" (UniqueName: 
\"kubernetes.io/projected/ced5d66d-39df-4267-b801-e1e60d517ace-kube-api-access-9r5sl\") pod \"machine-config-daemon-kfljj\" (UID: \"ced5d66d-39df-4267-b801-e1e60d517ace\") " pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787151 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787165 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.787174 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.790053 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkqt\" (UniqueName: \"kubernetes.io/projected/2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08-kube-api-access-7dkqt\") pod \"multus-additional-cni-plugins-b2f6h\" (UID: \"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\") " pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.790071 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrsmg\" (UniqueName: \"kubernetes.io/projected/194b252b-4eca-42f4-85e1-5c51a42eb407-kube-api-access-hrsmg\") pod \"node-resolver-24l69\" (UID: \"194b252b-4eca-42f4-85e1-5c51a42eb407\") " pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.791356 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5gdt\" (UniqueName: \"kubernetes.io/projected/f4c125b3-4a9c-46a7-a468-54e93c44751d-kube-api-access-m5gdt\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.791976 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"ovnkube-node-ns797\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.795274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.851238 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.860856 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.870942 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.872097 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778 WatchSource:0}: Error finding container e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778: Status 404 returned error can't find the container with id e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.880977 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-24l69" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.882058 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942 WatchSource:0}: Error finding container 9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942: Status 404 returned error can't find the container with id 9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889732 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.889774 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.891520 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.892612 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35 WatchSource:0}: Error finding container ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35: Status 404 returned error can't find the container with id ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.902361 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffb01b4eb5e35ecfb3ea97a0de5f6095a52ba4b2ae1914794e1a1e28f636cc35"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.903624 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.904183 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9887da5925f5a1e634859afec8aa0e2b7ceedea60ce74d5b0fdf9e04e500d942"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.905770 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e966cda8e8654747d75214e22bb99cbbf074fa8600f3d57f77ebee1a12ed4778"} Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.921044 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194b252b_4eca_42f4_85e1_5c51a42eb407.slice/crio-5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596 WatchSource:0}: Error finding container 5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596: Status 404 returned error can't find the container with id 5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.926783 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" Mar 10 00:07:48 crc kubenswrapper[4994]: W0310 00:07:48.944780 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a13a81_4c11_4529_8a3d_2dd3c73215a7.slice/crio-b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7 WatchSource:0}: Error finding container b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7: Status 404 returned error can't find the container with id b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7 Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.982370 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mcxcb" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991955 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.991998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.992014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.992026 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4994]: I0310 00:07:48.993809 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.001602 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jhp6z" Mar 10 00:07:49 crc kubenswrapper[4994]: W0310 00:07:49.010317 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dac87a5_07eb_488d_85fe_cb8848434ae5.slice/crio-7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c WatchSource:0}: Error finding container 7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c: Status 404 returned error can't find the container with id 7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c Mar 10 00:07:49 crc kubenswrapper[4994]: W0310 00:07:49.052565 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9fd1f0_58d6_4986_86b5_8c26c871e79b.slice/crio-607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b WatchSource:0}: Error finding container 607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b: Status 404 returned error can't find the container with id 607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.095172 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167294 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167419 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167448 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 
10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167498 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167461682 +0000 UTC m=+84.341168431 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167536 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167591 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167577826 +0000 UTC m=+84.341284575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167618 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167649 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167660 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.167661 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167712 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:50.16769624 +0000 UTC m=+84.341402989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167784 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.167839 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.167830104 +0000 UTC m=+84.341536943 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199329 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.199365 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.268130 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268362 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268380 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.268391 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.268914 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269224 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:50.269209204 +0000 UTC m=+84.442915953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269433 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: E0310 00:07:49.269465 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:50.269456793 +0000 UTC m=+84.443163542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.301976 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302385 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.302422 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404381 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404394 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.404428 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507348 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.507420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611668 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611691 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.611708 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715300 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715348 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715359 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715375 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.715389 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.773619 4994 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818459 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.818500 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.913778 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jhp6z" event={"ID":"7c9fd1f0-58d6-4986-86b5-8c26c871e79b","Type":"ContainerStarted","Data":"61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.913829 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jhp6z" event={"ID":"7c9fd1f0-58d6-4986-86b5-8c26c871e79b","Type":"ContainerStarted","Data":"607b6977609ad51e465ced1835f89caa686f47e8d1b53335c04f3b1e0107a67b"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917516 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917548 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.917564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" event={"ID":"cd1d8032-7c65-474f-9a19-a93bf0cac8ba","Type":"ContainerStarted","Data":"72de72f0dc1d88f178eb20671aedb9c97f4717c5ab9c8ccae29de4193ac08349"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.920043 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" exitCode=0 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 
00:07:49.920132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.920184 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922802 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922863 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922924 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922953 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.922976 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.924565 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928077 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928141 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.928165 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"3b13cdfd69ef4b401eb74eaace11f32c9b70f675cf90d3ad73f8b7acf3371165"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.931044 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.931129 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.932865 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.932915 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"7810b3b0c224bd62f286711d7bde48e3d72f3ed4791704b05bc78daa14b5ff2c"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935395 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3" exitCode=0 Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.935589 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerStarted","Data":"5dbf18b53d3f7ca2f039c6e87cadd8f5dd12f1d848f94884c1284843cc640226"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.937490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24l69" event={"ID":"194b252b-4eca-42f4-85e1-5c51a42eb407","Type":"ContainerStarted","Data":"d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f"} Mar 10 
00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.937530 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24l69" event={"ID":"194b252b-4eca-42f4-85e1-5c51a42eb407","Type":"ContainerStarted","Data":"5e381bb4e65c0584431ab12bdf67696869975d892e7a898c77054c1c289aa596"} Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.939022 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.962420 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc kubenswrapper[4994]: I0310 00:07:49.979035 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:49 crc 
kubenswrapper[4994]: I0310 00:07:49.998266 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.021403 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025596 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025609 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.025638 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.047619 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.062826 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.075171 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.091227 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.108952 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.125420 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128224 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128253 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.128264 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.140310 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.157865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.177837 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180255 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180381 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.180401 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180490 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180538 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180526688 +0000 UTC m=+86.354233437 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180593 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180604 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180613 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180634 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180655 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180625621 +0000 UTC m=+86.354332400 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180693 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180680263 +0000 UTC m=+86.354387052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.180715 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.180704004 +0000 UTC m=+86.354410793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.200096 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.226802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229974 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.229995 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.247179 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.260537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.273315 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281002 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281066 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281167 4994 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281191 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281201 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281207 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281250 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:52.281234045 +0000 UTC m=+86.454940794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.281269 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:52.281261716 +0000 UTC m=+86.454968465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.281669 4994 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.289595 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.302719 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07
:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.319352 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332833 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc 
kubenswrapper[4994]: I0310 00:07:50.332856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.332896 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.334008 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.347713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.361410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.379195 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.388958 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc 
kubenswrapper[4994]: I0310 00:07:50.400419 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435773 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435796 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.435807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538558 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538618 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538661 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.538683 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553887 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553905 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.553941 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.553989 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554078 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.554164 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554172 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:50 crc kubenswrapper[4994]: E0310 00:07:50.554356 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.558689 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.559578 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.560999 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.561902 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.563004 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.563603 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.564290 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.565344 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.566073 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.567102 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.567705 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.570346 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.571137 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.571772 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.572454 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.573009 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.573896 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.574563 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.575449 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.576311 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.577016 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.577752 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.578361 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.581287 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.581837 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.582434 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.583058 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.583584 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585034 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585510 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.585957 4994 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.586052 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.587308 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.587812 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.589238 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.590695 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.591379 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.592270 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.592910 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.593909 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.594374 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.595432 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.596234 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.597153 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.597583 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.598414 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.599210 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.600362 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.600842 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.601622 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.602074 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.602958 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.603718 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.604185 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641518 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641628 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.641647 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744338 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744842 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744955 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.744977 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847675 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.847744 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.944278 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3" exitCode=0 Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.944351 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952157 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952488 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.952652 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960170 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.960937 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.961102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.971107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:50 crc kubenswrapper[4994]: I0310 00:07:50.989035 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.007964 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.022278 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc 
kubenswrapper[4994]: I0310 00:07:51.037523 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057009 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057048 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.057099 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.058703 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.121356 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.143472 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.162689 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163032 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.163045 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.164414 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.197724 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.212343 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.223949 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.237077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.252312 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc 
kubenswrapper[4994]: I0310 00:07:51.265284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265307 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.265319 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367655 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367688 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367699 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.367725 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470153 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470197 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.470218 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.568809 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:51 crc kubenswrapper[4994]: E0310 00:07:51.569159 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.570145 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573211 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573233 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.573255 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677455 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.677589 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780352 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780409 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.780457 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884527 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.884546 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.967738 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.967725 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877" exitCode=0 Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.977828 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.977962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.978608 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:07:51 crc kubenswrapper[4994]: E0310 00:07:51.978904 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987389 4994 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987514 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.987536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4994]: I0310 00:07:51.997107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 
00:07:52.015805 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc 
kubenswrapper[4994]: I0310 00:07:52.032504 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.053651 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.070074 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.089437 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090440 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.090485 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.115755 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.136050 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.149638 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.163287 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.176988 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.193778 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.194555 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197081 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197272 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197235211 +0000 UTC m=+90.370941990 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197379 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197450 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.197493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197567 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197568 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197624 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197607174 +0000 UTC m=+90.371313923 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197669 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.197650645 +0000 UTC m=+90.371357404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197716 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197739 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197759 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.197812 4994 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.19779727 +0000 UTC m=+90.371504049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.211064 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.234400 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.253284 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297154 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc 
kubenswrapper[4994]: I0310 00:07:52.297203 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297217 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.297247 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.303252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.303417 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303463 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303543 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.303522678 +0000 UTC m=+90.477229447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303617 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303645 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303664 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.303735 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.303710175 +0000 UTC m=+90.477416964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400358 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400404 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.400423 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505919 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505937 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.505990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553829 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553897 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553960 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.553969 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554038 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554203 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554258 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:52 crc kubenswrapper[4994]: E0310 00:07:52.554389 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.608543 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.609625 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713592 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.713647 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.816286 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919186 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.919295 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.983822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d"} Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.988245 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58" exitCode=0 Mar 10 00:07:52 crc kubenswrapper[4994]: I0310 00:07:52.988296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.012304 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022644 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022799 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.022828 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.024220 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.037364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.060981 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2
115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.078807 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc 
kubenswrapper[4994]: I0310 00:07:53.098600 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.121122 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127174 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.127299 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.153368 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.169194 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.182962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.198452 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.217962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.232649 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.235111 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.260662 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.275167 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.291228 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.310435 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.327055 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336128 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.336182 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.348343 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.375297 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.391855 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.410798 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.425645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439598 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439611 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.439638 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.441018 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.454512 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.470621 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.478811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.499835 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.510528 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.520216 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.530717 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc 
kubenswrapper[4994]: I0310 00:07:53.542470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.542547 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645736 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645757 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.645765 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.748347 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.750098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.750843 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.751866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.752047 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855165 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855194 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.855212 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958703 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958721 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.958765 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.994858 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1" exitCode=0 Mar 10 00:07:53 crc kubenswrapper[4994]: I0310 00:07:53.995229 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.012468 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.013922 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.028356 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc 
kubenswrapper[4994]: I0310 00:07:54.040916 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.060531 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.064704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.064971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.065602 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.073537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.089785 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 
00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.123885 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.134107 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.144736 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.159148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168743 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168783 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168797 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.168824 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.171274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.180673 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.196327 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.215636 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.227083 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc 
kubenswrapper[4994]: I0310 00:07:54.271388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.271429 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373215 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373750 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.373984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.374198 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.477552 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.477944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478399 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.478587 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553334 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553410 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.553561 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554228 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554372 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.554412 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.554859 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:54 crc kubenswrapper[4994]: E0310 00:07:54.555246 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582569 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582610 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.582631 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685706 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685773 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.685839 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789323 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789341 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789367 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.789386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892432 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892527 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892558 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.892580 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.996330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.996673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.998711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.998860 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4994]: I0310 00:07:54.999028 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.024190 4994 generic.go:334] "Generic (PLEG): container finished" podID="2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08" containerID="74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9" exitCode=0 Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.024261 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerDied","Data":"74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.048592 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.066237 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.090480 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105540 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105677 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.105700 4994 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.106854 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc 
kubenswrapper[4994]: I0310 00:07:55.123430 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.142994 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.173909 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.188394 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.204425 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209255 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.209273 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.222760 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.238739 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.253413 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.271809 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28
b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.286700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.305858 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311340 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc 
kubenswrapper[4994]: I0310 00:07:55.311360 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.311392 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413313 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413379 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.413441 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.516645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517069 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517095 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.517110 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.566367 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621093 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621155 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.621167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723544 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723592 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.723634 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.825849 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4994]: I0310 00:07:55.927994 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.030602 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.033799 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" event={"ID":"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08","Type":"ContainerStarted","Data":"244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.040423 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.047544 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.058212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.069069 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.080612 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.089701 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.112075 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133529 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133593 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.133637 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.139326 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.161961 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.183808 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.201657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.215484 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.224812 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.234041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.235943 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.236104 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.236197 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.245520 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.256231 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.260596 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.260745 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.260722411 +0000 UTC m=+98.434429170 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.260919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.261052 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.261158 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261002 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261368 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261351953 +0000 UTC m=+98.435058702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261192 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261553 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261544739 +0000 UTC m=+98.435251488 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261261 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261748 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261808 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.261903 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.261895281 +0000 UTC m=+98.435602030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.313634 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.327661 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338179 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338204 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.338234 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.346579 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.356594 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.362560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.362600 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362701 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362743 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.362731112 +0000 UTC m=+98.536437851 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362798 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362813 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362822 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.362845 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:04.362837136 +0000 UTC m=+98.536543875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.370548 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.383917 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.396363 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.408290 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.419962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.434085 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440619 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440656 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.440682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.446823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.461543 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.477155 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.486848 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.505429 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.516842 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.529065 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.542565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.542750 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543438 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.543485 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553338 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553444 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553662 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553715 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553663 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553775 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.553630 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.553827 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.565785 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.577658 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.590964 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.599691 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.608503 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.619652 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.629462 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.646986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.647146 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.650062 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.663353 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.683485 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.695259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.712865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.722727 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.734360 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.746181 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc 
kubenswrapper[4994]: I0310 00:07:56.750004 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750066 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750086 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.750098 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.758525 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852465 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.852527 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868749 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.868809 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.888062 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892550 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.892595 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.909862 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913652 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913692 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.913732 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.934193 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937892 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.937906 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.948059 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951639 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.951676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.961483 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:56 crc kubenswrapper[4994]: E0310 00:07:56.961643 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963764 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963797 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963825 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4994]: I0310 00:07:56.963836 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043750 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043843 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.043865 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066327 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066341 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.066351 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.070676 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.073815 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.081980 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.092294 4994 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.103907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.119735 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.128793 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.139331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.150120 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.159811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168621 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.168662 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.172745 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.191005 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.202718 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.219645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.233604 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.252674 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.265802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273890 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273936 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273951 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.273960 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.288940 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.300700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:4
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.318538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.332815 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.351943 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.370536 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375773 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375903 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.375979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.376062 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.376136 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.383565 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.397200 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.419484 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.436007 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.456636 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.477610 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478774 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478854 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478887 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.478917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.478934 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.492264 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc 
kubenswrapper[4994]: I0310 00:07:57.501698 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.515285 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.526690 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.542928 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581721 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.581739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 
00:07:57.581754 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684033 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684073 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.684114 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787464 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787483 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787509 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.787527 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891556 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891607 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891624 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.891669 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995180 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995524 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4994]: I0310 00:07:57.995683 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099914 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.099983 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204474 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204560 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.204601 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307932 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.307986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.308005 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.308017 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411482 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411589 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.411606 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514855 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514911 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514938 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.514956 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553709 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553749 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553951 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.553952 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554373 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554392 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554483 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:58 crc kubenswrapper[4994]: E0310 00:07:58.554570 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618398 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618544 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.618570 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721541 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721589 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.721613 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824493 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824537 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.824554 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928488 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4994]: I0310 00:07:58.928536 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031507 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.031580 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.051093 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/0.log" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.055577 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990" exitCode=1 Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.055638 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.056730 4994 scope.go:117] "RemoveContainer" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.075802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.104246 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:07:58Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268414 6831 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268663 6831 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.268788 6831 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.269025 6831 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269124 6831 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269588 6831 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:07:58.269631 6831 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:07:58.269709 6831 factory.go:656] Stopping watch factory\\\\nI0310 00:07:58.269736 6831 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:07:58.269795 6831 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 
00:07:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9
eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.124478 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.136081 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137900 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137914 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137932 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.137945 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.149864 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.166747 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.181536 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.197267 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.218688 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28
b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.236026 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240490 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240522 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240535 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc 
kubenswrapper[4994]: I0310 00:07:59.240553 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.240566 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.258190 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.272659 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.284969 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.298691 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.310454 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.321683 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343031 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343112 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.343155 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446343 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.446360 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548921 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548977 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.548986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.549002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.549010 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.578447 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651373 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651412 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651435 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.651444 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.753993 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.754065 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857601 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857681 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.857721 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960868 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960889 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960903 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4994]: I0310 00:07:59.960912 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.061947 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.062695 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/0.log" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.062754 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.063313 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066301 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" exitCode=1 Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.066490 4994 scope.go:117] "RemoveContainer" containerID="a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.067386 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.067654 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.080297 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.099925 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a57cc93346842ba7a474ff006ae6d278bf81197d7ed6f3b03ccda0000db8b990\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:07:58Z\\\",\\\"message\\\":\\\".io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268414 6831 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.268663 6831 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.268788 6831 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:07:58.269025 6831 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269124 6831 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:07:58.269588 6831 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:07:58.269631 6831 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:07:58.269709 6831 factory.go:656] Stopping watch factory\\\\nI0310 00:07:58.269736 6831 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:07:58.269795 6831 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:07:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch 
factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cn
i/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.110553 4994 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.125977 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.140286 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.155386 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166263 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166391 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.166665 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.166684 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.178141 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.206607 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.229141 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.243469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.263137 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269444 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.269654 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269794 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.269966 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.270120 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.281673 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.295031 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.315754 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.328573 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc 
kubenswrapper[4994]: I0310 00:08:00.343346 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372380 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.372447 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475085 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475125 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.475159 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553798 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553807 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553832 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.553917 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.553842 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554104 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554240 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:00 crc kubenswrapper[4994]: E0310 00:08:00.554316 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577532 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577610 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577632 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.577651 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680681 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680698 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680728 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.680745 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784076 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.784149 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886840 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886857 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.886930 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990334 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990351 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990376 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4994]: I0310 00:08:00.990391 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.072976 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.078078 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:01 crc kubenswrapper[4994]: E0310 00:08:01.078251 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.091671 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093812 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.093941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.094027 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.106306 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.118600 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.135934 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.152546 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.165478 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.189527 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199519 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199588 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199610 4994 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.199625 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.208669 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.227590 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.243418 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.261570 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.275823 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.296537 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.301972 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302014 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302029 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.302063 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.312631 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc 
kubenswrapper[4994]: I0310 00:08:01.329237 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.349026 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.378761 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405470 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405487 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405510 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.405526 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508446 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.508463 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612016 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612060 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612076 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612099 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.612116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.715666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716025 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716357 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.716500 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.820926 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821061 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821240 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.821271 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924364 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924387 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4994]: I0310 00:08:01.924437 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.026927 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027102 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.027167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129890 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.129992 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.130078 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232911 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232930 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.232969 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.335987 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439437 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439547 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.439564 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.542841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.543023 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553561 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553609 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553649 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.553619 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.553762 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.553986 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.554177 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:02 crc kubenswrapper[4994]: E0310 00:08:02.554228 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645449 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645583 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.645605 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748414 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748478 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.748579 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.851842 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955271 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955355 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955384 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4994]: I0310 00:08:02.955400 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059279 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059389 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.059442 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.162986 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163077 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.163095 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.266799 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.266865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267132 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.267152 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370257 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370339 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.370388 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473185 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473203 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473226 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.473245 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.581935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.582539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.582589 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686097 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686152 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686170 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.686210 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.788987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789092 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.789114 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892504 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892590 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.892867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996072 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996083 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996098 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4994]: I0310 00:08:03.996123 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101346 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101411 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.101420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205322 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.205425 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308776 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308845 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308909 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.308929 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345773 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345930 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.345970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.346054 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346152 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346217 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.346195848 +0000 UTC m=+114.519902637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346739 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.346714425 +0000 UTC m=+114.520421214 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346901 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346938 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.346980 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347027 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.347013606 +0000 UTC m=+114.520720395 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347103 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.347140 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.347128899 +0000 UTC m=+114.520835688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415713 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415737 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415765 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.415790 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.446725 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.446829 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447097 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447190 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.447160374 +0000 UTC m=+114.620867163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447752 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447797 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447821 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.447926 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:20.447901449 +0000 UTC m=+114.621608238 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.518962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519020 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519067 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.519088 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553282 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553452 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553460 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553543 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553674 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.553758 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.553940 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:04 crc kubenswrapper[4994]: E0310 00:08:04.554116 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622199 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622239 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622256 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622300 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.622317 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725537 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.725674 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828720 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828786 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.828807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932126 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932144 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4994]: I0310 00:08:04.932184 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035785 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.035864 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139476 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.139531 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243803 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243849 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.243868 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346519 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346578 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346596 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.346609 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449574 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449591 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.449640 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551864 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551943 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.551983 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.552000 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.554411 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655538 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655554 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655578 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.655596 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.759857 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864415 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.864469 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966780 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966811 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966820 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966833 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4994]: I0310 00:08:05.966841 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.070516 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.071808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072032 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072221 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.072386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.113243 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.116004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.117093 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.138693 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.161811 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175466 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175484 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175511 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.175528 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.176522 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.189804 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.208469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.223772 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.239802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.259576 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.278698 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.278973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.279067 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.279170 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.279258 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.293902 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.315274 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 
00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.331696 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.349387 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.369711 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.381994 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382064 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.382075 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.384118 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.409000 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.432502 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.451189 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484531 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484587 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.484611 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553329 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553453 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553564 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553612 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.553634 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553679 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.553723 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.554017 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.570564 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587138 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587182 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587195 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.587223 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.592117 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.624573 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.638709 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.655091 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.671416 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689332 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689403 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689427 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.689479 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.694535 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.704331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.722774 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.745206 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.775193 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.792962 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793142 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793164 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.793215 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.806655 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.824743 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.838934 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.851162 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc 
kubenswrapper[4994]: I0310 00:08:06.868361 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.895622 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.895779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896299 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.896388 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971693 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971709 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.971751 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4994]: E0310 00:08:06.993610 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.998973 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999216 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999649 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4994]: I0310 00:08:06.999709 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.022267 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027163 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027219 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027262 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.027279 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.048004 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052740 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052810 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052834 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.052852 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.072762 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078111 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078131 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078156 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.078173 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.098744 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4994]: E0310 00:08:07.099644 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.101862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102362 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102555 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.102757 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206570 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206691 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.206709 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309924 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309979 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.309997 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415023 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415082 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.415141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.517962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518635 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.518910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.519069 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622800 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622869 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622918 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.622935 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726648 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726723 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726750 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.726767 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.830964 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831017 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.831083 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934318 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934806 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.934998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4994]: I0310 00:08:07.935139 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.037934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038011 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038037 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.038086 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140752 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140805 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140846 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.140864 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243541 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243612 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243633 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.243650 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346777 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346823 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346832 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346849 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.346860 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449562 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449609 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.449633 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553039 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553165 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553231 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553390 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553514 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.553603 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.553988 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:08 crc kubenswrapper[4994]: E0310 00:08:08.554186 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.554573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.554814 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555084 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.555542 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.658477 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.658912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659054 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659191 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.659354 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762298 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.762961 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.763146 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.763284 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866204 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866790 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.866987 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970566 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970637 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4994]: I0310 00:08:08.970676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074344 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074405 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.074465 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177534 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177554 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177580 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.177599 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280763 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280821 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280837 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.280901 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383778 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383851 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.383910 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.487756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488107 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488279 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.488547 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591325 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591803 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.591982 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.592107 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695708 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695736 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.695760 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798136 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798188 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798206 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798231 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.798248 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.900858 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.900990 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901034 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4994]: I0310 00:08:09.901052 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.004997 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005063 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005080 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005102 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.005124 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.107987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108042 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108058 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.108069 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216287 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216327 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.216343 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320177 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320259 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320284 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.320301 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423316 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.423339 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526367 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526392 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.526447 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553011 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553065 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553104 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553200 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.553235 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553391 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553557 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:10 crc kubenswrapper[4994]: E0310 00:08:10.553668 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629475 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.629533 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.732571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.732940 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.733386 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.836934 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.836992 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837010 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837035 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.837051 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940214 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940237 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4994]: I0310 00:08:10.940291 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044455 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044528 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044552 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.044694 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147545 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147706 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.147846 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.148018 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251227 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.251995 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.252021 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.252042 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354843 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354908 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.354928 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458162 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458223 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458239 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.458283 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560728 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.560741 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663571 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663624 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663667 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.663684 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766935 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.766951 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.869952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870009 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870027 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870051 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.870070 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973675 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973739 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:11 crc kubenswrapper[4994]: I0310 00:08:11.973801 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:11Z","lastTransitionTime":"2026-03-10T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076406 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076469 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.076530 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179526 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.179545 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282778 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.282799 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386321 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386369 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386380 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.386411 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489028 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489131 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489158 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.489293 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553661 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553699 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.553746 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.553826 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.553947 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.554056 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.554013 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:12 crc kubenswrapper[4994]: E0310 00:08:12.554302 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592565 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.592685 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695683 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695732 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695744 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.695773 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798809 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798851 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.798867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902122 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902166 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902177 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902199 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4994]: I0310 00:08:12.902211 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005350 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005395 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005405 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.005434 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108235 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108295 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108311 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108342 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.108360 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211606 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211684 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211707 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211735 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.211755 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.314944 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315046 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315071 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.315087 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418313 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418402 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418419 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.418459 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.521910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.522007 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.522032 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624247 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624282 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624290 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.624312 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726771 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726809 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726816 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.726840 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829539 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829556 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.829596 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932259 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932309 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932328 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4994]: I0310 00:08:13.932371 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035852 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035972 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.035990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138793 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138971 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.138990 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242178 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242265 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242282 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.242328 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345255 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345277 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.345297 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448666 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448695 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.448714 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551064 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551135 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551159 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.551176 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553526 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553575 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553565 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.553682 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.553849 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554048 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554183 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:14 crc kubenswrapper[4994]: E0310 00:08:14.554291 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653472 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653500 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653521 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.653529 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756396 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756453 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756475 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.756526 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860061 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860077 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.860116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962525 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962613 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4994]: I0310 00:08:14.962702 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065323 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065375 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065407 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.065425 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168492 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.168568 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271725 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.271777 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374267 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374360 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.374839 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478065 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478113 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478156 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.478173 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.555298 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.580923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581243 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581266 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.581282 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684233 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684254 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.684295 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786642 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786654 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786671 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.786682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889686 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889718 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.889730 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992658 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992716 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4994]: I0310 00:08:15.992742 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094861 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.094974 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.158846 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.163141 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.163863 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.176776 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.186657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197567 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197643 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197673 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.197696 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.204833 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.218984 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.235130 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.250759 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.278411 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.292446 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301118 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301140 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.301168 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.301190 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.304756 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.317508 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.330250 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.341700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.357498 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.386063 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.402538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404380 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404419 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404429 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404444 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.404453 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.421410 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.439335 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506630 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.506705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506758 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.506808 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.553752 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.553968 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.554007 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.553959 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554236 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.554291 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554431 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:16 crc kubenswrapper[4994]: E0310 00:08:16.554550 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.572546 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.596024 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609530 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609564 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609572 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609584 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.609592 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.613674 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.624713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.637518 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.659114 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.677947 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.696914 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714397 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714450 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714465 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.714476 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.719539 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.740999 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.758580 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.776781 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.794264 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.810048 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.816928 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817017 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817043 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.817104 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.834802 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.845910 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc 
kubenswrapper[4994]: I0310 00:08:16.859667 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919942 4994 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919953 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919968 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4994]: I0310 00:08:16.919980 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022747 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.022991 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126273 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126290 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.126301 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.169121 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.169762 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/1.log" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175484 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" exitCode=1 Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175544 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.175593 4994 scope.go:117] "RemoveContainer" containerID="eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.176183 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.176324 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.198437 4994 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229059 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eed7799fc9c841d040416277397628b548ba647d4309197b61d0fc1d6deac69c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"1.Namespace event handler 5 for removal\\\\nI0310 00:08:00.021673 6969 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:00.021681 6969 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:00.021702 6969 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 
00:08:00.021724 6969 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 00:08:00.021726 6969 factory.go:656] Stopping watch factory\\\\nI0310 00:08:00.021742 6969 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:00.021744 6969 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:00.021750 6969 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:00.021765 6969 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:00.021780 6969 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:00.021822 6969 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:00.022194 6969 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:00.022276 6969 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 
8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229447 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229480 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229491 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229508 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.229521 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.247542 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.262151 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.268869 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.268990 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269016 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.269071 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.281081 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.289676 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297274 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297335 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297383 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297402 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.297917 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.314010 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.317706 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319714 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319789 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.319813 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.320074 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.320125 4994 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.335971 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.341640 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346568 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346702 
4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.346720 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.356144 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.371259 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376690 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376745 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376764 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.376806 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.381094 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z 
is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.399165 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: E0310 00:08:17.399502 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401761 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401824 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401844 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.401860 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.413218 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.432935 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.445734 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc 
kubenswrapper[4994]: I0310 00:08:17.461476 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.476469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.490689 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505143 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505189 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505209 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505232 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.505249 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.510514 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608605 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608680 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608701 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.608756 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711853 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711942 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711959 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.711983 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.712002 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815185 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815246 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815291 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.815308 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918105 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918120 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:17 crc kubenswrapper[4994]: I0310 00:08:17.918131 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:17Z","lastTransitionTime":"2026-03-10T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021791 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021831 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.021849 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125094 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125167 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125212 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.125233 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.181506 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.187372 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.187631 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.204820 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.228395 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229103 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229172 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc 
kubenswrapper[4994]: I0310 00:08:18.229196 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229228 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.229251 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.247553 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.267927 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.284758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.300567 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332320 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332342 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332372 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.332390 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.335714 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.358843 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.377746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.392376 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.411851 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.422554 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435110 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435188 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435210 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435238 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.435260 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.445513 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.460143 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc 
kubenswrapper[4994]: I0310 00:08:18.477713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.498005 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.529657 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537685 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537807 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.537824 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553669 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553698 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.553853 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553897 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.553985 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554143 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554245 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:18 crc kubenswrapper[4994]: E0310 00:08:18.554424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.639998 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640040 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640068 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.640082 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742715 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742766 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742804 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.742821 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846190 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846213 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.846229 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949184 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949268 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949293 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:18 crc kubenswrapper[4994]: I0310 00:08:18.949310 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:18Z","lastTransitionTime":"2026-03-10T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052503 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052595 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052621 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.052640 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155041 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155100 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.155165 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257848 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257938 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257958 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.257984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.258002 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360901 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360951 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.360996 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.361014 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464310 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464370 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464388 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464416 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.464433 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.567945 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568004 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568026 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568055 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.568077 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.670974 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671044 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671066 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671095 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.671116 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.773864 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.773981 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774001 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774028 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.774047 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877304 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877446 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877468 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877501 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.877519 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980557 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980632 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980651 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980674 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:19 crc kubenswrapper[4994]: I0310 00:08:19.980691 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:19Z","lastTransitionTime":"2026-03-10T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084111 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084128 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.084167 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187247 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.187779 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.188012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.188224 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291657 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291718 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291740 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291772 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.291793 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.367993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368187 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:08:52.368149771 +0000 UTC m=+146.541856580 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368291 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.368427 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368483 4994 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368583 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368556815 +0000 UTC m=+146.542263604 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368590 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368661 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368639688 +0000 UTC m=+146.542346467 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368662 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368737 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368758 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.368862 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.368828265 +0000 UTC m=+146.542535044 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395248 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395303 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395325 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395356 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.395385 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.470322 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.470553 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470627 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470776 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.470738973 +0000 UTC m=+146.644445912 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470846 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470928 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.470958 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.471098 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:52.471058504 +0000 UTC m=+146.644765453 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.498923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.498997 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499018 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.499082 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553067 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553242 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553416 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553415 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.553517 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553666 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.553905 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:08:20 crc kubenswrapper[4994]: E0310 00:08:20.554041 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601683 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601733 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601755 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601784 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.601807 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706423 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706481 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.706509 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809830 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809937 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809962 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.809991 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.810015 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912281 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912355 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:20 crc kubenswrapper[4994]: I0310 00:08:20.912432 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:20Z","lastTransitionTime":"2026-03-10T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017597 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017615 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017641 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.017658 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120319 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120382 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120400 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120426 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.120448 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223735 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223759 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.223819 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332582 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332603 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332696 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.332761 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.436902 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.436999 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437019 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437046 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.437067 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541216 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541292 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541312 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541337 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.541356 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644576 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644636 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644653 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644678 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.644695 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749623 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749682 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749725 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.749743 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852734 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852760 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852832 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.852861 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957052 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957117 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957135 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957160 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:21 crc kubenswrapper[4994]: I0310 00:08:21.957180 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:21Z","lastTransitionTime":"2026-03-10T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.059941 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060003 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060021 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060053 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.060071 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162700 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162782 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162802 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.162843 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.265988 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266025 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266056 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266071 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.266080 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369408 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369512 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369545 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.369567 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473422 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473523 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473546 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473569 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.473587 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.554052 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.554285 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.554984 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555085 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.555164 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2"
Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555258 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.555426 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:08:22 crc kubenswrapper[4994]: E0310 00:08:22.555517 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577371 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577431 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577451 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.577493 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680865 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680965 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.680982 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.681008 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.681027 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785242 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785263 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.785305 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.889963 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890050 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890075 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.890094 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994147 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994164 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994187 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:22 crc kubenswrapper[4994]: I0310 00:08:22.994205 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:22Z","lastTransitionTime":"2026-03-10T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097866 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097957 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.097975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.098002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.098021 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202330 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202393 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202413 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202441 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.202458 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305787 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305835 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305854 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.305928 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409497 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409547 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409563 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409586 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.409603 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513649 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513722 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513753 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.513770 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617288 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617357 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617376 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617401 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.617420 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720757 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720923 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.720947 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824559 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824629 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824650 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824676 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.824692 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.843844 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.864507 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 
00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.886201 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.902130 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.925501 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927570 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927662 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927695 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927731 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.927757 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:23Z","lastTransitionTime":"2026-03-10T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.941354 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc 
kubenswrapper[4994]: I0310 00:08:23.959570 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:23 crc kubenswrapper[4994]: I0310 00:08:23.985255 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:23Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.005058 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.017830 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.027543 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.030987 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031152 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031198 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.031240 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.041749 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.060497 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.078228 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.097918 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.123144 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134387 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134432 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134448 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.134459 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.145050 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver
-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.163460 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237081 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237205 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237226 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237250 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.237268 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340679 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340756 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340783 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.340800 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443354 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443445 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.443460 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546763 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546847 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546860 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546910 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.546925 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553334 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553373 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553507 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.553597 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553516 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.553826 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:24 crc kubenswrapper[4994]: E0310 00:08:24.554007 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651129 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651205 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651251 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.651269 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754611 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754664 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754684 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754711 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.754729 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857181 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857602 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857620 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857645 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.857665 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960281 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960334 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960350 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960374 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:24 crc kubenswrapper[4994]: I0310 00:08:24.960390 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:24Z","lastTransitionTime":"2026-03-10T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063781 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063913 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.063957 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166551 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166619 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166638 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166659 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.166676 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269585 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269599 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.269608 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373221 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373561 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373704 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.373858 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.374057 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.476899 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477003 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477023 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.477064 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580018 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580096 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580114 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580141 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.580159 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682707 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682838 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682912 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.682934 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789047 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789224 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789244 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789317 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.789338 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893074 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893133 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893149 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893173 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.893190 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996647 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996670 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996738 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:25 crc kubenswrapper[4994]: I0310 00:08:25.996796 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:25Z","lastTransitionTime":"2026-03-10T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101084 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101193 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101225 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.101255 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204137 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204201 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204243 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.204261 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307151 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307601 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307751 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.307867 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.410952 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411030 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411049 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411073 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.411094 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.516299 4994 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553489 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553866 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.554574 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.554024 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.554643 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.553965 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.555335 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.555065 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.577960 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.611605 4994 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.631867 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.654109 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: E0310 00:08:26.656439 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.675454 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.691680 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.709604 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc 
kubenswrapper[4994]: I0310 00:08:26.731244 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.752538 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.774095 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.807758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.830048 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.851041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc 
kubenswrapper[4994]: I0310 00:08:26.870198 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.891407 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.907698 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4994]: I0310 00:08:26.933903 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641517 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641579 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641600 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.641628 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 
00:08:27.641648 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.663640 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669353 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669410 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669452 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.669512 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.690079 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695486 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695506 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695532 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.695549 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.714767 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.719939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720059 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720089 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720119 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.720141 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.740244 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745349 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745421 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745445 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:27 crc kubenswrapper[4994]: I0310 00:08:27.745495 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:27Z","lastTransitionTime":"2026-03-10T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.765769 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4994]: E0310 00:08:27.766017 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.553309 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.553477 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.553746 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.553847 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.554262 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:28 crc kubenswrapper[4994]: I0310 00:08:28.554396 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.554588 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:28 crc kubenswrapper[4994]: E0310 00:08:28.554750 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.553435 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.553624 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.553993 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554112 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.554209 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:30 crc kubenswrapper[4994]: I0310 00:08:30.554225 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554389 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:30 crc kubenswrapper[4994]: E0310 00:08:30.554537 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:31 crc kubenswrapper[4994]: E0310 00:08:31.658251 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553510 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553625 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553525 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:32 crc kubenswrapper[4994]: I0310 00:08:32.553560 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.553740 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.553843 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.554107 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:32 crc kubenswrapper[4994]: E0310 00:08:32.554187 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:33 crc kubenswrapper[4994]: I0310 00:08:33.554466 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:33 crc kubenswrapper[4994]: E0310 00:08:33.554770 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553847 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553941 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553949 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:34 crc kubenswrapper[4994]: I0310 00:08:34.553861 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554039 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554202 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554319 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:34 crc kubenswrapper[4994]: E0310 00:08:34.554443 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.254594 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.255711 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" exitCode=1 Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.255842 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c"} Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.256592 4994 scope.go:117] "RemoveContainer" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.281393 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.300225 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.317555 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.349824 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.372105 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.387815 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc 
kubenswrapper[4994]: I0310 00:08:36.404686 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.420909 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.435423 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.452895 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.482430 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.503413 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.521611 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.541640 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.553904 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.553961 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.554040 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.554183 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554272 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554379 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554514 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.554634 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.559736 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.572907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.585429 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc 
kubenswrapper[4994]: I0310 00:08:36.602716 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.617897 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.634955 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.650779 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: E0310 00:08:36.658865 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.666610 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.681123 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.717258 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7
285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.738051 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.755713 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.772702 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.790334 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.816865 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.834641 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.847895 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.862816 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.890479 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4994]: I0310 00:08:36.909007 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.263029 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.263127 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106"} Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.300515 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.322433 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.343967 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.362791 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.382517 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.402589 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.418488 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.436499 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc 
kubenswrapper[4994]: I0310 00:08:37.472486 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.493912 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.513830 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.534864 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.551340 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc5
4ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.574118 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332
b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.590412 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc 
kubenswrapper[4994]: I0310 00:08:37.607781 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.627700 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796386 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796439 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796457 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796483 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.796505 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.819672 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824899 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824954 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.824975 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.825002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.825023 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.846464 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850777 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850841 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850862 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850918 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.850937 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.868622 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874529 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874604 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874626 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874660 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.874682 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.894518 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899425 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899482 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899502 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899531 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:37 crc kubenswrapper[4994]: I0310 00:08:37.899548 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:37Z","lastTransitionTime":"2026-03-10T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.919124 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:37 crc kubenswrapper[4994]: E0310 00:08:37.919352 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553341 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553452 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553360 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.553565 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:38 crc kubenswrapper[4994]: I0310 00:08:38.553606 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.553951 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.554029 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:38 crc kubenswrapper[4994]: E0310 00:08:38.554073 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553198 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.553297 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.553493 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.553923 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.554180 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:40 crc kubenswrapper[4994]: E0310 00:08:40.554347 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:40 crc kubenswrapper[4994]: I0310 00:08:40.569143 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 00:08:41 crc kubenswrapper[4994]: E0310 00:08:41.660945 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553513 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553644 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.553714 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553738 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.553911 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:42 crc kubenswrapper[4994]: I0310 00:08:42.553952 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.554031 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:42 crc kubenswrapper[4994]: E0310 00:08:42.554169 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553367 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553397 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553489 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553549 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553714 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.553755 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.553955 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:44 crc kubenswrapper[4994]: E0310 00:08:44.554396 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:44 crc kubenswrapper[4994]: I0310 00:08:44.568449 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553131 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553231 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.553306 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553320 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.553338 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.553440 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.554253 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.554376 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.554534 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.589469 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.609617 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.630794 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.650041 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: E0310 00:08:46.663597 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.674293 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.693550 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.711282 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.724805 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.736746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc 
kubenswrapper[4994]: I0310 00:08:46.768984 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.787965 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.800847 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.815912 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.833498 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.847920 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.867685 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b672648524
2efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.880157 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.896526 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:46 crc kubenswrapper[4994]: I0310 00:08:46.914970 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.302009 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.305099 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.305575 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.319662 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.343326 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4994]: I0310 00:08:47.364122 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.378206 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.392897 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.406853 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.418329 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.429265 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.446117 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.467645 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.481733 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.498699 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.509198 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.520479 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.533998 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.544437 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4994]: I0310 00:08:47.555904 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.569191 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4994]: I0310 00:08:47.588834 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180418 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180463 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180473 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180489 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.180502 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.199040 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203921 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203969 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.203984 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.204002 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.204013 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.217893 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221884 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221917 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221925 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221939 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.221948 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.239732 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244218 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244261 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244273 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244289 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.244302 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.261417 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264822 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264850 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264859 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264893 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.264903 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:48Z","lastTransitionTime":"2026-03-10T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.277280 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.277395 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.310334 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.311100 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/2.log" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.313927 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" 
containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" exitCode=1 Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.313973 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.314016 4994 scope.go:117] "RemoveContainer" containerID="e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.315327 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.315553 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.334010 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.350527 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.368514 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.382412 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc 
kubenswrapper[4994]: I0310 00:08:48.398629 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.418208 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.446341 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e92a1ba5c65124e2aa96668b7df48d49b3db43ae113fa8b70e98c174450bf7cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"nding *v1.Node event handler 7 for removal\\\\nI0310 00:08:16.449147 7176 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:16.449144 7176 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:16.449167 7176 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:16.449186 7176 
handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:16.449189 7176 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:16.449246 7176 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:16.449272 7176 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:16.449290 7176 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:16.449419 7176 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449463 7176 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:16.449489 7176 factory.go:656] Stopping watch factory\\\\nI0310 00:08:16.449504 7176 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:16.449517 7176 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 00:08:16.449561 7176 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:16.449591 7176 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:16.449695 7176 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9e
ba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.459764 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.479163 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.503989 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.523292 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.540818 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554128 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554233 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554269 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554463 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554550 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554619 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.554725 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:48 crc kubenswrapper[4994]: E0310 00:08:48.554801 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.559234 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.575280 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.590353 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.614262 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.635616 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.652350 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4994]: I0310 00:08:48.670490 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.321622 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.327568 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:08:49 crc kubenswrapper[4994]: E0310 00:08:49.327949 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.347212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.366739 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.385364 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.416220 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.438900 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.455211 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4994]: I0310 00:08:49.472259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.492746 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.507473 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.530134 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d3758
9d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.550077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.580624 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.598817 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.618148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.633931 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.646464 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.662670 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4994]: I0310 00:08:49.683246 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4994]: I0310 00:08:49.733241 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553228 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553332 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.553433 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553454 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.553622 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:50 crc kubenswrapper[4994]: I0310 00:08:50.553751 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.554030 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:50 crc kubenswrapper[4994]: E0310 00:08:50.554120 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:51 crc kubenswrapper[4994]: E0310 00:08:51.664634 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414493 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.414832 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.414789359 +0000 UTC m=+210.588496148 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.414869 4994 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.414966 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415011 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.414983403 +0000 UTC m=+210.588690182 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.415049 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415144 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415200 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415224 4994 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415227 4994 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415298 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.41528213 +0000 UTC m=+210.588988909 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.415359 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.41531892 +0000 UTC m=+210.589025679 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.516689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.516943 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517035 4994 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517217 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs podName:f4c125b3-4a9c-46a7-a468-54e93c44751d nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.517177258 +0000 UTC m=+210.690884187 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs") pod "network-metrics-daemon-vxjt2" (UID: "f4c125b3-4a9c-46a7-a468-54e93c44751d") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517246 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517304 4994 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517326 4994 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.517440 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.517406253 +0000 UTC m=+210.691113152 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553597 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553659 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4994]: I0310 00:08:52.553660 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.553935 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554083 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554297 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:52 crc kubenswrapper[4994]: E0310 00:08:52.554395 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553513 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554168 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553692 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554288 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553706 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554406 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:54 crc kubenswrapper[4994]: I0310 00:08:54.553611 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:54 crc kubenswrapper[4994]: E0310 00:08:54.554494 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553341 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553448 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553572 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553380 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553743 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.553852 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.553996 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.554088 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.575493 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.598627 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.618670 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.635907 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6
733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.652029 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: E0310 00:08:56.665279 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.667589 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.690992 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.712148 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.731560 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.770720 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.794689 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.828259 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.851758 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.871275 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc 
kubenswrapper[4994]: I0310 00:08:56.888176 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.907626 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.920314 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.939024 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:56 crc kubenswrapper[4994]: I0310 00:08:56.970905 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546278 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546333 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546378 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546398 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.546409 4994 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554352 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554358 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554510 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.554700 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.554977 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.555059 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.555232 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.556700 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.567312 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573712 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573767 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573788 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573817 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.573836 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.595186 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600366 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600417 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600428 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600454 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.600470 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.620632 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625752 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625808 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625826 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625856 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.625905 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.647827 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653705 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653762 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653780 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653804 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:58 crc kubenswrapper[4994]: I0310 00:08:58.653821 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:58Z","lastTransitionTime":"2026-03-10T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.674838 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:58 crc kubenswrapper[4994]: E0310 00:08:58.675122 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553828 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553926 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.553850 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:00 crc kubenswrapper[4994]: I0310 00:09:00.554003 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554054 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554282 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554509 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:00 crc kubenswrapper[4994]: E0310 00:09:00.554579 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:01 crc kubenswrapper[4994]: E0310 00:09:01.666550 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553335 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553460 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.553748 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553799 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.553833 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554058 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554497 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.554746 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:02 crc kubenswrapper[4994]: I0310 00:09:02.555925 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:02 crc kubenswrapper[4994]: E0310 00:09:02.556176 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.553584 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.553693 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.553779 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.553908 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.554029 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.554172 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:04 crc kubenswrapper[4994]: I0310 00:09:04.554419 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:04 crc kubenswrapper[4994]: E0310 00:09:04.554588 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553423 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553610 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553855 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.553845 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.553932 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554065 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554485 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.554626 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.574683 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e28954a3d2cb65c7806fda13f27ecfb48ef9431f17c9d7261dc9550320b116d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab52696526f8810c62addcf742d6fd6cd612c04afce4fcf5757283e376d75c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.606181 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 00:08:47.724956 7508 services_controller.go:445] Built service openshift-dns/dns-default LB template configs for network=default: []services.lbConfig(nil)\\\\nI0310 00:08:47.724964 7508 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0310 00:08:47.724977 7508 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF0310 00:08:47.724982 7508 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbd31f838cabf5a804
ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s42gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ns797\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.627253 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.645557 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70c9019a621effac3d3b64c12f85da1f943d4ecd69e5edebedfc016e4dd201d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.664745 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: E0310 00:09:06.672477 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.683077 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ced5d66d-39df-4267-b801-e1e60d517ace\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14a301971e2b48af554ccf8c6dea4ff926e1f95181aa51bb233686a7aecc7c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9r5sl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kfljj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.699547 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jhp6z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c9fd1f0-58d6-4986-86b5-8c26c871e79b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cc5b9e339d7c7c6bee9dbeb07870ce2a5b7a5f29c679538592ad53f57a04e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwkj5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jhp6z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.714625 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b21ff3c0-eaf7-406e-b698-7d99bbfbf1a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3952ae64d04da5dbe8de79d57a58f02d7440a30c2a16df6264adf2a0e29d573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76b163cdcf3bab299b5b3123d67864038fbef9c91cb4b7ae31838725d7a40f99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc 
kubenswrapper[4994]: I0310 00:09:06.735665 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45a7a43f-ba45-4f54-92a9-6fae6144cd7f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558ec9f0b766ac2f80c3d721a99be111e4f7af9e09bd7880b4197525bc7d7406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2081481365deafe7b15dcdbe3ef7227353473a4e2924ecaf02c9d294d80b5bb1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:28.788269 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:28.790599 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:28.823559 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:28.826417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:58.399257 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:58.399347 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c6d813390aa8385ab9838a01b3f678e39dde836a7084291095d4582ca467b83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f52835a4ae8c3bb9dfd82d6d82a25994f8f5182fe9de997d39c5dd4561260f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.757341 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:07:15Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:07:15.409957 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:07:15.410087 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:07:15.411115 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2081040014/tls.crt::/tmp/serving-cert-2081040014/tls.key\\\\\\\"\\\\nI0310 00:07:15.621689 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:07:15.625337 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:07:15.625380 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:07:15.625401 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:07:15.625408 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:07:15.631798 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0310 00:07:15.631825 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 00:07:15.631836 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631849 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:07:15.631860 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:07:15.631867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:07:15.631902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:07:15.631910 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 00:07:15.632770 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b834fabb0fc5d1f7dfc8ea8a298c8c1d
3868046105304eb5d3cc158bdbd80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.777528 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.799331 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mcxcb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dac87a5-07eb-488d-85fe-cb8848434ae5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"2026-03-10T00:07:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5\\\\n2026-03-10T00:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f0daa6e0-6b0a-4f70-8f04-e301f16e6ce5 to /host/opt/cni/bin/\\\\n2026-03-10T00:07:51Z [verbose] multus-daemon started\\\\n2026-03-10T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mcxcb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.816533 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d23bec6-0598-4970-8f25-e80867c2ad16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://045232a99c43b7a39857ce2e6902de25546425a59bb0f9c7b076e6f0d9629f56\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c0b61110f8d95cb59bdf543779a8e2787b6b9b8f10548fa643d5bfb41f24d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7cdb503feaf0be642fb32d4cac4a4ab28552f16b83033f22c9eacd90a623ce3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4cbc4ecd8b21fb884e5bd4518fe21dc1f48ddb7b04085618c3f8210b7cbc1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.848212 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2be9b02-cd28-4406-a993-86d081914e55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5836da1e1c61bdfecb55eaa300bb45306f99cfc81d03b8a45f1ac656d1302176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c3eb13ee41353cc882a69316fb709e4bafffb42d22dd408341731bbf70f8ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a8015c2493da15d96602871209e3533f4ac6095c809d3a61ef54416fa1fef04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb41cc82d27c13f66f85d613f0b96a4ddd10fb78bf1d997352ad8db2f1f829e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34af8ea68e0f050177cef50e1f3e45a7e4838d0b24c582515886a5de11522637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd624c77af6bd56f254a3d1774e6608831279afce70c30ad2b9320558ffaf5c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac6f378d76567abc2513f792c5768d74effa322a707f9792a4dbbb65aa29aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d227a7f529956db6d1154d2ae32dc58d7285c53ef1da7b00ce9bc6486db3010\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.868631 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fa5eb21-ad3b-4258-bd65-3f8dcd86ba08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://244f5d921db73d37589d909a530767707c532e6bff997d3616d3078c683f47f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23c266de2cfdeac7d9472096e85aea25d495f5e8d942868d49bb5e6cc0fc3cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54e567acc7942a1a36654f53b0c65b8a0cec4e0aa28c108dbc34b98052967b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42def778b8465e354cc812be1187351dd8f2115e276e35505a45e7c745cd877\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef332b89673c40c431b607c3e9e77f5fbe50446cd18040eb0b78bc5b9dcb1e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59c7de93a6e1c7f323ea98b6726485242efb02c49960b2902afa3194f0e8a8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f5038f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74055118e5e3408aab2f50
38f76a3ca083bd253f4ebede33f3736558373303a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dkqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2f6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.883497 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c125b3-4a9c-46a7-a468-54e93c44751d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5gdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxjt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc 
kubenswrapper[4994]: I0310 00:09:06.899158 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd1d8032-7c65-474f-9a19-a93bf0cac8ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://049e811f0c2fbd31afd17698b8a6db981eb82861ee58336ae6fac90b6cff14b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a321f7923a3b4078fba4f15b6ba4072ec221648ed5915843ef9771374396c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfsnl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-d28jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.915920 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://19b4e1d251094933c0c3d64d3f217d7b231f95e83148137b2e030fb649d53b35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:06 crc kubenswrapper[4994]: I0310 00:09:06.930821 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-24l69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"194b252b-4eca-42f4-85e1-5c51a42eb407\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f77053d7028cf7513cc54ee0aeae0c25f0252107a878270c0066098ffba09f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrsmg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-24l69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553761 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553867 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553943 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.553944 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.555481 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.555651 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.556012 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.556116 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680012 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680101 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680123 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680150 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.680169 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.702705 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709365 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709489 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709520 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.709545 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.727918 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734171 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734336 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734460 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734573 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.734668 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.756479 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761430 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761496 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761515 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761542 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.761562 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.782752 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788208 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788264 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788283 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788312 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:08 crc kubenswrapper[4994]: I0310 00:09:08.788333 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:08Z","lastTransitionTime":"2026-03-10T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.808355 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9894519f-677e-4b1e-80a1-f7e7d58a0619\\\",\\\"systemUUID\\\":\\\"c9a6b1d9-12bb-4e1d-8072-25b4f73868f8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:09:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:09:08 crc kubenswrapper[4994]: E0310 00:09:08.808582 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553310 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553396 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553471 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.553679 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:10 crc kubenswrapper[4994]: I0310 00:09:10.553749 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.553853 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.554122 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:10 crc kubenswrapper[4994]: E0310 00:09:10.554168 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:11 crc kubenswrapper[4994]: E0310 00:09:11.674629 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553434 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553523 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553618 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553788 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:12 crc kubenswrapper[4994]: I0310 00:09:12.553857 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.553983 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:12 crc kubenswrapper[4994]: E0310 00:09:12.554076 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:13 crc kubenswrapper[4994]: I0310 00:09:13.554490 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:13 crc kubenswrapper[4994]: E0310 00:09:13.554762 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553088 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553091 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553224 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:14 crc kubenswrapper[4994]: I0310 00:09:14.553511 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.553688 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554234 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554534 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:14 crc kubenswrapper[4994]: E0310 00:09:14.554568 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554168 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554276 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554739 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.554852 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555059 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555171 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555249 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.555512 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.604433 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-d28jn" podStartSLOduration=124.604399043 podStartE2EDuration="2m4.604399043s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.604046756 +0000 UTC m=+170.777753545" watchObservedRunningTime="2026-03-10 00:09:16.604399043 +0000 UTC m=+170.778105832" Mar 10 00:09:16 crc kubenswrapper[4994]: E0310 00:09:16.676029 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.681543 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-24l69" podStartSLOduration=125.681516187 podStartE2EDuration="2m5.681516187s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.648909754 +0000 UTC m=+170.822616543" watchObservedRunningTime="2026-03-10 00:09:16.681516187 +0000 UTC m=+170.855222976" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.709817 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b2f6h" podStartSLOduration=125.709774476 podStartE2EDuration="2m5.709774476s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.682399576 +0000 UTC m=+170.856106365" watchObservedRunningTime="2026-03-10 00:09:16.709774476 +0000 UTC m=+170.883481275" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.836154 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jhp6z" podStartSLOduration=125.836118702 podStartE2EDuration="2m5.836118702s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.835917738 +0000 UTC m=+171.009624547" watchObservedRunningTime="2026-03-10 00:09:16.836118702 +0000 UTC m=+171.009825491" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.836783 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podStartSLOduration=125.836772677 podStartE2EDuration="2m5.836772677s" 
podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.819566386 +0000 UTC m=+170.993273175" watchObservedRunningTime="2026-03-10 00:09:16.836772677 +0000 UTC m=+171.010479466" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.850504 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=81.85047612299999 podStartE2EDuration="1m21.850476123s" podCreationTimestamp="2026-03-10 00:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.850307979 +0000 UTC m=+171.024014768" watchObservedRunningTime="2026-03-10 00:09:16.850476123 +0000 UTC m=+171.024182912" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.874984 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=32.87495716 podStartE2EDuration="32.87495716s" podCreationTimestamp="2026-03-10 00:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.874587993 +0000 UTC m=+171.048294742" watchObservedRunningTime="2026-03-10 00:09:16.87495716 +0000 UTC m=+171.048663939" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.920059 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mcxcb" podStartSLOduration=125.920035483 podStartE2EDuration="2m5.920035483s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.919807908 +0000 UTC m=+171.093514667" 
watchObservedRunningTime="2026-03-10 00:09:16.920035483 +0000 UTC m=+171.093742232" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.933108 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.933074955 podStartE2EDuration="36.933074955s" podCreationTimestamp="2026-03-10 00:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.932584184 +0000 UTC m=+171.106290943" watchObservedRunningTime="2026-03-10 00:09:16.933074955 +0000 UTC m=+171.106781714" Mar 10 00:09:16 crc kubenswrapper[4994]: I0310 00:09:16.957949 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=77.957921591 podStartE2EDuration="1m17.957921591s" podCreationTimestamp="2026-03-10 00:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.956164103 +0000 UTC m=+171.129870862" watchObservedRunningTime="2026-03-10 00:09:16.957921591 +0000 UTC m=+171.131628350" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553505 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553593 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553596 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.553682 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.553719 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554078 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554113 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:18 crc kubenswrapper[4994]: E0310 00:09:18.554278 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993374 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993443 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993463 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993493 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:09:18 crc kubenswrapper[4994]: I0310 00:09:18.993522 4994 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:09:18Z","lastTransitionTime":"2026-03-10T00:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.064053 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.064024256 podStartE2EDuration="1m28.064024256s" podCreationTimestamp="2026-03-10 00:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.972208969 +0000 UTC m=+171.145915728" watchObservedRunningTime="2026-03-10 00:09:19.064024256 +0000 UTC m=+173.237731045" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.067222 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c"] Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.068063 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.069906 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.070966 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.071859 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.072938 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150855 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.150942 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.151011 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.151071 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251617 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251931 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.251983 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252015 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252174 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252389 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8f78cac8-8497-4562-a457-3650bda3763b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.252796 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f78cac8-8497-4562-a457-3650bda3763b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.261690 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f78cac8-8497-4562-a457-3650bda3763b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 
00:09:19.281270 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f78cac8-8497-4562-a457-3650bda3763b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jxn8c\" (UID: \"8f78cac8-8497-4562-a457-3650bda3763b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.393035 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" Mar 10 00:09:19 crc kubenswrapper[4994]: W0310 00:09:19.411923 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f78cac8_8497_4562_a457_3650bda3763b.slice/crio-fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216 WatchSource:0}: Error finding container fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216: Status 404 returned error can't find the container with id fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216 Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.466398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" event={"ID":"8f78cac8-8497-4562-a457-3650bda3763b","Type":"ContainerStarted","Data":"fcb58fc6c527be484e4db272d9983b77c5ceadd4b8ec45272617a375f879b216"} Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.585801 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 00:09:19 crc kubenswrapper[4994]: I0310 00:09:19.597539 4994 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.473206 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" event={"ID":"8f78cac8-8497-4562-a457-3650bda3763b","Type":"ContainerStarted","Data":"be745be1c0f181666631c98c9f488824f4fe59f9cfa59e72ee16f5e487265cca"} Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.497305 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jxn8c" podStartSLOduration=129.497280991 podStartE2EDuration="2m9.497280991s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:20.497111518 +0000 UTC m=+174.670818297" watchObservedRunningTime="2026-03-10 00:09:20.497280991 +0000 UTC m=+174.670987760" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.553958 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.554044 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554144 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.553985 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554300 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554577 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:20 crc kubenswrapper[4994]: I0310 00:09:20.554793 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:20 crc kubenswrapper[4994]: E0310 00:09:20.554967 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:21 crc kubenswrapper[4994]: E0310 00:09:21.677670 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.483904 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484617 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/0.log" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484690 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" exitCode=1 Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484733 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106"} Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.484781 4994 scope.go:117] "RemoveContainer" containerID="5fd397ab4fc06b77ee8da4969bf7e109c78e3fc4bca230e237786995d211b70c" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.485419 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.485690 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579532 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579680 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.579697 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:22 crc kubenswrapper[4994]: I0310 00:09:22.579778 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580107 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580242 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:22 crc kubenswrapper[4994]: E0310 00:09:22.580337 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:23 crc kubenswrapper[4994]: I0310 00:09:23.492498 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553730 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553733 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.553806 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.554317 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554506 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554623 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.554750 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:24 crc kubenswrapper[4994]: I0310 00:09:24.555019 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.555030 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:24 crc kubenswrapper[4994]: E0310 00:09:24.555304 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ns797_openshift-ovn-kubernetes(72a13a81-4c11-4529-8a3d-2dd3c73215a7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.553076 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.553133 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554501 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.554539 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:26 crc kubenswrapper[4994]: I0310 00:09:26.554618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.554937 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.555043 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:26 crc kubenswrapper[4994]: E0310 00:09:26.678522 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553427 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553481 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553549 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.553659 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:28 crc kubenswrapper[4994]: I0310 00:09:28.553710 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.553854 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.554019 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:28 crc kubenswrapper[4994]: E0310 00:09:28.554196 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.553772 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.553835 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.553981 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.554040 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:30 crc kubenswrapper[4994]: I0310 00:09:30.554083 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554210 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554333 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:30 crc kubenswrapper[4994]: E0310 00:09:30.554424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:31 crc kubenswrapper[4994]: E0310 00:09:31.680128 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.553547 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.553777 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554142 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554189 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554315 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:32 crc kubenswrapper[4994]: I0310 00:09:32.554346 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554467 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:32 crc kubenswrapper[4994]: E0310 00:09:32.554601 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553136 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553198 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553301 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553458 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:34 crc kubenswrapper[4994]: I0310 00:09:34.553492 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553686 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553722 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:34 crc kubenswrapper[4994]: E0310 00:09:34.553807 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.553330 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.553443 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555359 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.555394 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555532 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:36 crc kubenswrapper[4994]: I0310 00:09:36.555585 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555669 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.555756 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:36 crc kubenswrapper[4994]: E0310 00:09:36.680695 4994 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 00:09:37 crc kubenswrapper[4994]: I0310 00:09:37.554669 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" Mar 10 00:09:37 crc kubenswrapper[4994]: I0310 00:09:37.555250 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553778 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553815 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554529 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553987 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.553812 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554687 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.554979 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.555039 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.564448 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.564613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89"} Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.569080 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.573426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerStarted","Data":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.575088 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.587511 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxjt2"] Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.587714 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:38 crc kubenswrapper[4994]: E0310 00:09:38.587990 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:38 crc kubenswrapper[4994]: I0310 00:09:38.638953 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podStartSLOduration=147.638933044 podStartE2EDuration="2m27.638933044s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:38.638329131 +0000 UTC m=+192.812035890" watchObservedRunningTime="2026-03-10 00:09:38.638933044 +0000 UTC m=+192.812639803" Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553574 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554205 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553574 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553645 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:40 crc kubenswrapper[4994]: I0310 00:09:40.553632 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554532 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxjt2" podUID="f4c125b3-4a9c-46a7-a468-54e93c44751d" Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554393 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:09:40 crc kubenswrapper[4994]: E0310 00:09:40.554804 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.553989 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554003 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554216 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.554331 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558430 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558624 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558779 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.558805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.560324 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 00:09:42 crc kubenswrapper[4994]: I0310 00:09:42.560800 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 00:09:48 crc kubenswrapper[4994]: I0310 00:09:48.924008 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.401414 4994 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.454924 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.456848 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.460845 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.464643 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469056 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469093 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469508 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469199 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469680 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.469200 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.473040 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.473387 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.474082 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.475810 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.476343 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.477696 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.477911 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.482991 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.483520 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.483689 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: W0310 00:09:49.484258 4994 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 10 00:09:49 crc kubenswrapper[4994]: E0310 00:09:49.484311 4994 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.484495 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.484953 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485076 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485267 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485385 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485626 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485717 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485863 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486064 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486150 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.485687 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486324 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486785 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.486924 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.487554 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488038 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488095 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488270 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488386 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.489258 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.489494 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488495 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488779 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488629 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.491795 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.492000 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.488775 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.492347 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.500809 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.505567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506026 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506205 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506341 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506570 4994 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.506664 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.507132 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.508510 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.509261 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.516832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.517358 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.517838 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518216 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518409 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.518618 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc 
kubenswrapper[4994]: I0310 00:09:49.518983 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520204 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520496 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520650 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.520832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.521255 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.535118 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.535173 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.536203 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.546642 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.547191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548431 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548661 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.548839 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.567406 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.567663 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.568434 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.568627 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.570918 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.572270 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.572833 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573185 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573514 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.573978 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574045 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574394 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574677 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574702 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574947 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575184 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.576213 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.574948 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.583843 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584251 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584281 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" 
(UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584309 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584337 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584367 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584414 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584444 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584472 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584505 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584531 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584560 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584620 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584644 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575001 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584730 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575057 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585147 4994 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575178 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585274 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x6s5d"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575334 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575409 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585603 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.585671 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575468 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575559 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586145 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586180 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586304 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575585 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.575708 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.576960 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582084 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582485 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582535 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582583 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582732 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582843 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.582988 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586381 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587066 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587566 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.587958 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588195 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588378 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588616 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588753 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.584650 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598698 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598755 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598815 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod 
\"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598845 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.598870 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586655 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.600945 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601036 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601062 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601104 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601127 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601171 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601209 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: 
\"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.601257 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586683 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.588940 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.593960 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.586932 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.589227 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.592398 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.593480 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.603499 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594243 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594389 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594434 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.603794 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594557 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594599 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.594638 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.597613 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.602056 4994 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608384 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmfx4\" (UniqueName: \"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608419 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608512 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608539 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " 
pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608689 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608717 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608773 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608808 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608835 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.608949 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609127 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609194 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609225 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609251 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609265 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.602454 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609350 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: 
\"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609514 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609527 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609662 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609719 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609747 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609754 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609799 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609837 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.609985 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610075 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610160 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610304 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610457 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610540 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.614221 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.610628 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615576 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod 
\"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615643 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.615680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.616352 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.620121 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.620836 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.621136 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.621344 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.624332 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.624776 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.628191 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.629029 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.634041 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.634500 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.635336 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.635964 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.636738 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.637584 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.638508 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.639276 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.639862 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.641120 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.642060 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.642740 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.643649 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.643862 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.644410 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.645959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646294 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646319 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.646693 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647401 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647424 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647436 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.647910 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649705 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649746 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.649840 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.651702 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.653083 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.654135 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.655228 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.656345 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.657482 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.660372 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.661039 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.662123 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.670501 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.672133 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.672661 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.679375 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.679809 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.681018 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.682097 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.683750 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.684745 4994 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.685791 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.687115 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.689700 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.690598 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.692521 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.694030 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.695043 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.696027 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.696943 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.697422 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.698501 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.699471 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.699931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.702268 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.707785 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.711507 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.716092 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717051 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") 
" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717108 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717139 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717194 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717217 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717235 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717304 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717324 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717344 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717365 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717382 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717401 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717435 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717452 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717474 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717494 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717537 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717555 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717574 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717621 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717653 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717736 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717776 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717795 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717818 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717838 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717858 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmfx4\" (UniqueName: 
\"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717877 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717912 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717930 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717973 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717993 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718011 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718032 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718054 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" 
(UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718072 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718107 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718125 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718181 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718201 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718218 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718235 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-config\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718256 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718276 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718295 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718321 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718347 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718371 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718390 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718417 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718450 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718468 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718488 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" 
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718506 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718528 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718545 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718563 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718587 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718645 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718718 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718769 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718836 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718861 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718943 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718967 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.718990 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719040 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod 
\"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719063 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719087 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719115 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719142 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719165 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719190 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719221 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719248 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc 
kubenswrapper[4994]: I0310 00:09:49.719293 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719320 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719347 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719370 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719395 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc 
kubenswrapper[4994]: I0310 00:09:49.719419 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719443 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719498 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719531 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwggf\" (UniqueName: 
\"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719554 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719590 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719642 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719676 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719734 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719762 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719767 4994 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719786 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.719903 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721163 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.717700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721655 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.721849 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722422 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-config\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722498 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722584 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-node-pullsecrets\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.722924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ed61f01-8d13-4883-ac58-0e998df5c20d-trusted-ca\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.724264 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727518 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-client\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.728601 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-dir\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727839 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06c3fca2-64b6-47e2-885f-948eac331c10-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.727723 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.729125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit-dir\") pod \"apiserver-76f77b778f-lxxqb\" (UID: 
\"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730236 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-serving-cert\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730337 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5ced5c-b690-4a1c-8d48-bbf789366816-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.730785 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06c3fca2-64b6-47e2-885f-948eac331c10-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731061 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731219 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.731969 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1109e060-ef32-407d-8283-eba65e1d4eaa-machine-approver-tls\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733818 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.733958 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-etcd-serving-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734169 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734273 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.734458 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1109e060-ef32-407d-8283-eba65e1d4eaa-auth-proxy-config\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735263 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-config\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735480 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-encryption-config\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735566 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-config\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735639 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5937dfbb-0da7-439c-94cb-e0e1f658d464-audit-policies\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-service-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.735647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736065 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5ced5c-b690-4a1c-8d48-bbf789366816-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736248 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736375 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-etcd-client\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736554 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-images\") pod 
\"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.736778 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-image-import-ca\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737127 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-serving-cert\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737202 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-audit\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737317 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.737963 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.738433 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5937dfbb-0da7-439c-94cb-e0e1f658d464-serving-cert\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.739454 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.739912 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.740570 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a1c67e3-f6df-4b4d-b3a3-669503580446-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: 
\"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743511 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743686 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.743890 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-serving-cert\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744015 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ed61f01-8d13-4883-ac58-0e998df5c20d-serving-cert\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.744922 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747063 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.747990 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.748161 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-encryption-config\") pod 
\"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.749465 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.751002 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.752917 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.755185 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-47fkz"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.756169 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.756279 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.761849 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.779420 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.800202 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.820209 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821186 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821261 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821324 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821480 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 
00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821660 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821698 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821828 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.821945 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822004 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822134 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822163 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822209 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822350 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822442 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822492 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822549 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID:
\"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822635 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822656 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822729 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822784 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822811 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822831 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822924 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822950 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822975 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.822999 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823032 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823563 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-tmpfs\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823625 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-trusted-ca-bundle\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.823994 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.824277 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-config\") pod
\"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.824739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-oauth-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.825419 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11b78073-cc4a-4a6f-89ab-631fde4b3371-service-ca\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.826568 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-oauth-config\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.826594 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-apiservice-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.827651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-webhook-cert\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.829100 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11b78073-cc4a-4a6f-89ab-631fde4b3371-console-serving-cert\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.839708 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.849196 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018c45cc-8cfa-497b-b6cf-25b10c694c58-metrics-tls\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.860060 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.887007 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.893664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/018c45cc-8cfa-497b-b6cf-25b10c694c58-trusted-ca\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.899322 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.918906 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.940090 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.958966 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.979041 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 00:09:49 crc kubenswrapper[4994]: I0310 00:09:49.999840 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.020392 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.039877 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.059505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.079639 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.099170 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.118761 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.140666 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.160757 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.179694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.200054 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.220479 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.240221 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.247263 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbc1f03-d386-460b-81f5-e6b7d3630557-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.260037 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.265750 4994 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dbc1f03-d386-460b-81f5-e6b7d3630557-config\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.279547 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.288194 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2204937d-9632-46e6-8f26-0cea8593d1a5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.299453 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.320654 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.340726 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.360161 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.380785 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.400035 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.408614 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.410285 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-profile-collector-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.439856 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.461707 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.480604 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.500086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.521186 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.529150 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f559d0-b505-4855-91e3-e46804b0c9f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.540254 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.542643 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f559d0-b505-4855-91e3-e46804b0c9f1-config\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.559061 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.569193 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49f58ba5-3573-4894-a320-fcf4ca4e50f1-srv-cert\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.580069 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.600609 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.620032 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.637522 4994 request.go:700] Waited for 1.012205583s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.640919 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.660594 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.681228 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.699505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.720011 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.742649 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.759077 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 00:09:50 crc
kubenswrapper[4994]: I0310 00:09:50.779583 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.799822 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.818939 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822486 4994 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822558 4994 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822610 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics podName:b85bbdaa-daa8-4c69-abf9-9f1200eb07cd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322579533 +0000 UTC m=+205.496286322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics") pod "marketplace-operator-79b997595-tgf68" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822638 4994 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca podName:b85bbdaa-daa8-4c69-abf9-9f1200eb07cd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322627634 +0000 UTC m=+205.496334413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca") pod "marketplace-operator-79b997595-tgf68" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.822707 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls podName:f15954a6-2036-4c32-a8b6-bc8e227d0fcd nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.322685405 +0000 UTC m=+205.496392184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-vjj5j" (UID: "f15954a6-2036-4c32-a8b6-bc8e227d0fcd") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823674 4994 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823741 4994 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823776 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls podName:2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323751902 +0000 UTC m=+205.497458691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls") pod "machine-config-controller-84d6567774-966nr" (UID: "2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823821 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert podName:5cc67063-d02f-4cb9-a15d-0d0a5c457e6e nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323797733 +0000 UTC m=+205.497504522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert") pod "olm-operator-6b444d44fb-c4cdv" (UID: "5cc67063-d02f-4cb9-a15d-0d0a5c457e6e") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823843 4994 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823920 4994 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.823967 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls podName:1377f73a-df08-4450-afa1-960e15891141 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.323942286 +0000 UTC m=+205.497649075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls") pod "dns-default-wv4d4" (UID: "1377f73a-df08-4450-afa1-960e15891141") : failed to sync secret cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: E0310 00:09:50.824086 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume podName:1377f73a-df08-4450-afa1-960e15891141 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:51.324046939 +0000 UTC m=+205.497753908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume") pod "dns-default-wv4d4" (UID: "1377f73a-df08-4450-afa1-960e15891141") : failed to sync configmap cache: timed out waiting for the condition
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.840601 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.860567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.880077 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.900972 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.921726 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.951946 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.960174 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 00:09:50 crc kubenswrapper[4994]: I0310 00:09:50.980430 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.000427 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 00:09:51 crc kubenswrapper[4994]:
I0310 00:09:51.019844 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.039806 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.059921 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.079231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.100722 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.120938 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.141740 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.159339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.179432 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.201201 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.220634 4994 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.240822 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.261705 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.279737 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.300955 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.321004 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.339086 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358700 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358788 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: 
\"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.358878 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359007 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359248 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.359363 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.360318 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.362144 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.364738 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1377f73a-df08-4450-afa1-960e15891141-metrics-tls\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.365414 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-srv-cert\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.367511 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-proxy-tls\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.368938 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.370366 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.383780 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.391300 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1377f73a-df08-4450-afa1-960e15891141-config-volume\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.420422 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.439829 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.460690 4994 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.479836 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.526906 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6gr\" (UniqueName: \"kubernetes.io/projected/fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d-kube-api-access-rs6gr\") pod \"machine-api-operator-5694c8668f-m6jnx\" (UID: \"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.553552 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l78l\" (UniqueName: \"kubernetes.io/projected/9a1c67e3-f6df-4b4d-b3a3-669503580446-kube-api-access-9l78l\") pod \"cluster-samples-operator-665b6dd947-87hn7\" (UID: \"9a1c67e3-f6df-4b4d-b3a3-669503580446\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.571557 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"oauth-openshift-558db77b4-fxpkq\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.588394 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvwk\" (UniqueName: \"kubernetes.io/projected/06c3fca2-64b6-47e2-885f-948eac331c10-kube-api-access-lpvwk\") pod \"openshift-controller-manager-operator-756b6f6bc6-mpvlf\" (UID: \"06c3fca2-64b6-47e2-885f-948eac331c10\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.607902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"route-controller-manager-6576b87f9c-bkhqb\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.627537 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fl9d\" (UniqueName: \"kubernetes.io/projected/3a5ced5c-b690-4a1c-8d48-bbf789366816-kube-api-access-4fl9d\") pod \"openshift-apiserver-operator-796bbdcf4f-lgqdf\" (UID: \"3a5ced5c-b690-4a1c-8d48-bbf789366816\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.638272 4994 request.go:700] Waited for 1.909735757s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.646325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmfx4\" (UniqueName: \"kubernetes.io/projected/1109e060-ef32-407d-8283-eba65e1d4eaa-kube-api-access-wmfx4\") pod \"machine-approver-56656f9798-zpr72\" (UID: \"1109e060-ef32-407d-8283-eba65e1d4eaa\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.650093 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.663605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s798j\" (UniqueName: \"kubernetes.io/projected/ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21-kube-api-access-s798j\") pod \"apiserver-76f77b778f-lxxqb\" (UID: \"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21\") " pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.681937 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"controller-manager-879f6c89f-xm7j6\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.701289 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwggf\" (UniqueName: \"kubernetes.io/projected/46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d-kube-api-access-jwggf\") pod \"authentication-operator-69f744f599-725jp\" (UID: \"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.719359 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d8f6\" (UniqueName: \"kubernetes.io/projected/6ed61f01-8d13-4883-ac58-0e998df5c20d-kube-api-access-5d8f6\") pod \"console-operator-58897d9998-pwpc6\" (UID: \"6ed61f01-8d13-4883-ac58-0e998df5c20d\") " pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.729059 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.739479 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjwm\" (UniqueName: \"kubernetes.io/projected/4c9cbda0-655c-4cf9-8f9a-23b3ebf37339-kube-api-access-npjwm\") pod \"openshift-config-operator-7777fb866f-mzg85\" (UID: \"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.746061 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.756150 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.760827 4994 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.767387 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.769855 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2qtg\" (UniqueName: \"kubernetes.io/projected/5937dfbb-0da7-439c-94cb-e0e1f658d464-kube-api-access-w2qtg\") pod \"apiserver-7bbb656c7d-xtfzl\" (UID: \"5937dfbb-0da7-439c-94cb-e0e1f658d464\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.777650 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.779699 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.785370 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.793967 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.799863 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.800366 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.813819 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.822073 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.840077 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.859518 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.878757 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.895745 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.900522 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4dbc1f03-d386-460b-81f5-e6b7d3630557-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2m4gh\" (UID: \"4dbc1f03-d386-460b-81f5-e6b7d3630557\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.909579 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.925817 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qdfc\" (UniqueName: \"kubernetes.io/projected/2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a-kube-api-access-6qdfc\") pod \"machine-config-controller-84d6567774-966nr\" (UID: \"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.958700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvnx\" (UniqueName: \"kubernetes.io/projected/11b78073-cc4a-4a6f-89ab-631fde4b3371-kube-api-access-gvvnx\") pod \"console-f9d7485db-rlqtz\" (UID: \"11b78073-cc4a-4a6f-89ab-631fde4b3371\") " pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.967637 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.970718 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jt5\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-kube-api-access-b7jt5\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.977780 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6snqr\" (UniqueName: \"kubernetes.io/projected/f15954a6-2036-4c32-a8b6-bc8e227d0fcd-kube-api-access-6snqr\") pod \"control-plane-machine-set-operator-78cbb6b69f-vjj5j\" (UID: \"f15954a6-2036-4c32-a8b6-bc8e227d0fcd\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.990981 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pwpc6"] Mar 10 00:09:51 crc kubenswrapper[4994]: I0310 00:09:51.993904 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5dn\" (UniqueName: \"kubernetes.io/projected/5cc67063-d02f-4cb9-a15d-0d0a5c457e6e-kube-api-access-hq5dn\") pod \"olm-operator-6b444d44fb-c4cdv\" (UID: \"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.013086 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed61f01_8d13_4883_ac58_0e998df5c20d.slice/crio-41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5 WatchSource:0}: Error finding container 
41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5: Status 404 returned error can't find the container with id 41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.016225 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"marketplace-operator-79b997595-tgf68\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.042754 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgr88\" (UniqueName: \"kubernetes.io/projected/1377f73a-df08-4450-afa1-960e15891141-kube-api-access-dgr88\") pod \"dns-default-wv4d4\" (UID: \"1377f73a-df08-4450-afa1-960e15891141\") " pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.059633 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzm2\" (UniqueName: \"kubernetes.io/projected/70d3076a-1af2-4aed-93ac-8dbbebd7e7d5-kube-api-access-qxzm2\") pod \"packageserver-d55dfcdfc-d8g97\" (UID: \"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.059970 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.071533 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.075776 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f559d0-b505-4855-91e3-e46804b0c9f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4xmwf\" (UID: \"29f559d0-b505-4855-91e3-e46804b0c9f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.095840 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhnvv\" (UniqueName: \"kubernetes.io/projected/3ffed56d-e2ab-4fa9-9dac-98c382395f2f-kube-api-access-lhnvv\") pod \"migrator-59844c95c7-fkpf5\" (UID: \"3ffed56d-e2ab-4fa9-9dac-98c382395f2f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.120308 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.122589 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pv9b\" (UniqueName: \"kubernetes.io/projected/49f58ba5-3573-4894-a320-fcf4ca4e50f1-kube-api-access-4pv9b\") pod \"catalog-operator-68c6474976-bkq7b\" (UID: \"49f58ba5-3573-4894-a320-fcf4ca4e50f1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.135959 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5f5\" (UniqueName: \"kubernetes.io/projected/2204937d-9632-46e6-8f26-0cea8593d1a5-kube-api-access-mv5f5\") pod \"package-server-manager-789f6589d5-cvds8\" (UID: \"2204937d-9632-46e6-8f26-0cea8593d1a5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.152679 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.162324 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.162772 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/018c45cc-8cfa-497b-b6cf-25b10c694c58-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xhzh4\" (UID: \"018c45cc-8cfa-497b-b6cf-25b10c694c58\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.182721 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.183709 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.194526 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.213663 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.214477 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.223938 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.244839 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lxxqb"] Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.269471 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5937dfbb_0da7_439c_94cb_e0e1f658d464.slice/crio-8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad WatchSource:0}: Error finding container 8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad: Status 404 returned error can't find the container with id 8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.271336 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5d8ed1_6cb6_4ed6_b0b4_7a2c795dbb21.slice/crio-47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2 WatchSource:0}: Error finding container 47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2: Status 404 returned error can't find the container with id 47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274595 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274710 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274784 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod 
\"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.274980 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275043 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275090 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275117 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275152 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275211 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275251 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275279 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275321 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275386 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275465 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 
00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275491 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275540 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275564 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275623 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275647 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: 
\"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275697 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275724 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275836 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275857 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275909 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275932 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.275976 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276023 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276088 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276224 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 
00:09:52.276247 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276268 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276288 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276313 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276336 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276359 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276454 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: 
I0310 00:09:52.276475 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.276500 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.279511 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.281297 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.781278505 +0000 UTC m=+206.954985254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.306742 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.311937 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m6jnx"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.314653 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.324674 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mzg85"] Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.350932 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe6f59a5_bf2e_4926_b6f2_a18b4cb5479d.slice/crio-f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55 WatchSource:0}: Error finding container f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55: Status 404 returned error can't find the container with id f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55 Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.383196 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.383848 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384067 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384096 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384136 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.384184 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.884158921 +0000 UTC m=+207.057865670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384186 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384251 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384278 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384312 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384340 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384383 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384420 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.384453 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384467 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384502 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384537 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " 
pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384592 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384608 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384623 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384664 4994 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384679 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384740 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384759 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384779 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384796 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384828 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384844 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384896 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384914 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384938 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384945 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384980 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.384998 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 
00:09:52.385025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385042 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385112 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385130 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385147 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.385176 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385205 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385257 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385278 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385297 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " 
pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385367 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385386 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385439 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385472 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod 
\"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385509 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385549 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385574 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385612 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385650 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385672 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385706 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385763 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 
crc kubenswrapper[4994]: I0310 00:09:52.385798 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385846 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385870 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.385933 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.390448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2bd9787-6df4-492a-8cab-18201a143385-config\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.394351 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28b5630a-9f96-453c-ac88-70d75b7d438d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395233 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-images\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395292 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10eead56-2e9b-4d48-ab81-d1638b3cdddc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.395421 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/964eb9a7-a580-44d4-b5e5-fe84d085823c-metrics-tls\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.396166 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5448e53f-3b74-47f1-9b28-705f36fd6ea3-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.411346 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.911325092 +0000 UTC m=+207.085031841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.400006 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bd9787-6df4-492a-8cab-18201a143385-serving-cert\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412230 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412481 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-default-certificate\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.401026 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.412991 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28b5630a-9f96-453c-ac88-70d75b7d438d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.413952 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.415519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e80a388-91b3-42f1-9ee2-70ab4850652d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.415965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.416247 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-metrics-certs\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.417494 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.419549 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-service-ca-bundle\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.425827 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc 
kubenswrapper[4994]: I0310 00:09:52.429710 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.429743 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.429812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e80a388-91b3-42f1-9ee2-70ab4850652d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.433565 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-cabundle\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.434740 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5448e53f-3b74-47f1-9b28-705f36fd6ea3-proxy-tls\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: 
\"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438294 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-signing-key\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-config\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438851 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-service-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.438919 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-client\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.440739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eb21f66e-5c18-49bb-8146-8185434e7c2f-etcd-ca\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.441345 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.441913 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-stats-auth\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.442601 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rk4\" (UniqueName: \"kubernetes.io/projected/4fb67636-fcba-4975-a460-403cd6ee9c25-kube-api-access-d9rk4\") pod \"downloads-7954f5f757-8lrmb\" (UID: \"4fb67636-fcba-4975-a460-403cd6ee9c25\") " pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.453005 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"collect-profiles-29551680-lqhzx\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.470016 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.470720 4994 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.474701 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb21f66e-5c18-49bb-8146-8185434e7c2f-serving-cert\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.480027 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.486137 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"image-pruner-29551680-sz8pz\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") " pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.486931 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.487188 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.987166531 +0000 UTC m=+207.160873280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487344 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487416 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod 
\"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487460 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487541 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487563 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487584 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487619 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: 
\"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.487712 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.488167 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:52.988152595 +0000 UTC m=+207.161859344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488594 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-csi-data-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488645 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-registration-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.488689 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-plugins-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " 
pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.491436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fd7640d-700a-420e-b15f-7f681090727b-socket-dir\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498010 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"auto-csr-approver-29551688-9zsf6\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498156 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498541 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwk4\" (UniqueName: \"kubernetes.io/projected/eb21f66e-5c18-49bb-8146-8185434e7c2f-kube-api-access-glwk4\") pod \"etcd-operator-b45778765-hqlnc\" (UID: \"eb21f66e-5c18-49bb-8146-8185434e7c2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.498611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-certs\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.499400 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22c138b8-4431-4695-be3f-0ea008d21f30-cert\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.499794 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/40e41a4c-dba3-4862-9f06-c59c538785be-node-bootstrap-token\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.500542 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.502739 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.514443 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fxv\" (UniqueName: \"kubernetes.io/projected/e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4-kube-api-access-q8fxv\") pod \"kube-storage-version-migrator-operator-b67b599dd-jkd8t\" (UID: \"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.535828 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsk4r\" (UniqueName: \"kubernetes.io/projected/964eb9a7-a580-44d4-b5e5-fe84d085823c-kube-api-access-tsk4r\") pod \"dns-operator-744455d44c-6sx4w\" (UID: \"964eb9a7-a580-44d4-b5e5-fe84d085823c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.570383 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.592906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.593540 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.093514074 +0000 UTC m=+207.267220823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.595705 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.596040 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spglq\" (UniqueName: \"kubernetes.io/projected/10eead56-2e9b-4d48-ab81-d1638b3cdddc-kube-api-access-spglq\") pod \"multus-admission-controller-857f4d67dd-vh5ns\" (UID: \"10eead56-2e9b-4d48-ab81-d1638b3cdddc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.608482 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r68q\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-kube-api-access-9r68q\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.622457 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.628470 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.629550 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.631539 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.640305 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerStarted","Data":"8831fb3579ee9cdb923ec4d8c1aef2294468f33bab907bd5867001319278e8ad"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.642801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"c0c0f64e7f4625d4eaa33d78bcd1621a09cc39a5950f18848d396639c17eeac8"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.642903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"cd6bc53b06185452ec5ffc30b9345a8b0e9c6035b2724964bd1af174b6193e59"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.644335 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" 
event={"ID":"06c3fca2-64b6-47e2-885f-948eac331c10","Type":"ContainerStarted","Data":"73f546835be1acbcf5b5076338b328854de718376e12b8334ef017808360e5a8"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.644983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgzm\" (UniqueName: \"kubernetes.io/projected/a2bd9787-6df4-492a-8cab-18201a143385-kube-api-access-wkgzm\") pod \"service-ca-operator-777779d784-nf5dh\" (UID: \"a2bd9787-6df4-492a-8cab-18201a143385\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.645651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529vv\" (UniqueName: \"kubernetes.io/projected/5448e53f-3b74-47f1-9b28-705f36fd6ea3-kube-api-access-529vv\") pod \"machine-config-operator-74547568cd-lzcrv\" (UID: \"5448e53f-3b74-47f1-9b28-705f36fd6ea3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.645755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerStarted","Data":"68e7ebb7b9a0fa967b84de70c209836439efd368a03ea8c0304dd46c8d9878be"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.651061 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.687269 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" event={"ID":"3a5ced5c-b690-4a1c-8d48-bbf789366816","Type":"ContainerStarted","Data":"a4f316c89cad6047ed47de9ae8a2fe8006cd6a1653d6e93a18547fa5d8c4c263"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.687920 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.694857 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.695605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.696049 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.19603645 +0000 UTC m=+207.369743199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.701685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28b5630a-9f96-453c-ac88-70d75b7d438d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5t2rc\" (UID: \"28b5630a-9f96-453c-ac88-70d75b7d438d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.709647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwjf\" (UniqueName: \"kubernetes.io/projected/f9b1c3de-e5a3-467f-929b-afb8687fb7f0-kube-api-access-ngwjf\") pod \"router-default-5444994796-x6s5d\" (UID: \"f9b1c3de-e5a3-467f-929b-afb8687fb7f0\") " pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.715138 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.722810 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.734805 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.739217 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"47cdd0fc5df08b42eefeda9e713ce29bf75c312c7780a55dff7b457658f9ecb2"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.739657 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1e80a388-91b3-42f1-9ee2-70ab4850652d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hmt98\" (UID: \"1e80a388-91b3-42f1-9ee2-70ab4850652d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.742410 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.748564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerStarted","Data":"63fb8a0ac5bfc9c8de9bc34e4233fe302c0bdff0000c0a3baace5279815afe2c"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754330 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" event={"ID":"6ed61f01-8d13-4883-ac58-0e998df5c20d","Type":"ContainerStarted","Data":"93848edcd94d1b8f74a090519d280473073418447e4977799669a8e2feb77dcb"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754394 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" 
event={"ID":"6ed61f01-8d13-4883-ac58-0e998df5c20d","Type":"ContainerStarted","Data":"41d7b08bd6160997086ea083f385225cccd4bb088f59fd6fb1f00cf222b305b5"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.754976 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.755434 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z95f9\" (UniqueName: \"kubernetes.io/projected/0ad3539f-9691-4344-9c7f-1b015c5e3b3d-kube-api-access-z95f9\") pod \"service-ca-9c57cc56f-pvfj5\" (UID: \"0ad3539f-9691-4344-9c7f-1b015c5e3b3d\") " pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.755569 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757265 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerStarted","Data":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757305 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerStarted","Data":"579d8e47d1cae1e88f269db0e29bcd43ee29c56b451d1f988d01fa0b8de660ec"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.757782 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.759931 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" 
event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"f8a9fca1baf9c26e7d4e01430c74bc08dfce2dc2a5bf912289541ecfef313f55"} Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.772420 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-966nr"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.775587 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkcb\" (UniqueName: \"kubernetes.io/projected/7fd7640d-700a-420e-b15f-7f681090727b-kube-api-access-7tkcb\") pod \"csi-hostpathplugin-cf2xx\" (UID: \"7fd7640d-700a-420e-b15f-7f681090727b\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779460 4994 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fxpkq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779768 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779801 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.781696 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.779527 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.787186 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.799609 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.799963 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.299948031 +0000 UTC m=+207.473654780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.802012 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rlqtz"] Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.806105 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84x8v\" (UniqueName: \"kubernetes.io/projected/22c138b8-4431-4695-be3f-0ea008d21f30-kube-api-access-84x8v\") pod \"ingress-canary-5hvbc\" (UID: \"22c138b8-4431-4695-be3f-0ea008d21f30\") " pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.818110 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzzb\" (UniqueName: \"kubernetes.io/projected/40e41a4c-dba3-4862-9f06-c59c538785be-kube-api-access-6fzzb\") pod \"machine-config-server-47fkz\" (UID: \"40e41a4c-dba3-4862-9f06-c59c538785be\") " pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.850072 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.885515 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.908836 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.911775 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85bbdaa_daa8_4c69_abf9_9f1200eb07cd.slice/crio-18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5 WatchSource:0}: Error finding container 18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5: Status 404 returned error can't find the container with id 18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5 Mar 10 00:09:52 crc kubenswrapper[4994]: E0310 00:09:52.912470 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.412444519 +0000 UTC m=+207.586151268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:52 crc kubenswrapper[4994]: I0310 00:09:52.940603 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" Mar 10 00:09:52 crc kubenswrapper[4994]: W0310 00:09:52.959192 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11b78073_cc4a_4a6f_89ab_631fde4b3371.slice/crio-463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f WatchSource:0}: Error finding container 463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f: Status 404 returned error can't find the container with id 463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.011739 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.011888 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.511842287 +0000 UTC m=+207.685549036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.012314 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.012818 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.512799541 +0000 UTC m=+207.686506290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.071571 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5hvbc" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.105698 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-47fkz" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.112896 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.144295 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.644260093 +0000 UTC m=+207.817966842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.190349 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wv4d4"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.227055 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.227328 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.727315832 +0000 UTC m=+207.901022581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.249926 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.315073 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.316413 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-725jp"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.322274 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.327958 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.328350 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:53.828325771 +0000 UTC m=+208.002032520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.331731 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.336390 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.336487 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.405728 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8lrmb"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.429709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.430999 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-10 00:09:53.930985302 +0000 UTC m=+208.104692051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.437696 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podStartSLOduration=162.437671269 podStartE2EDuration="2m42.437671269s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:53.433485645 +0000 UTC m=+207.607192414" watchObservedRunningTime="2026-03-10 00:09:53.437671269 +0000 UTC m=+207.611378018" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.447640 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hqlnc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.456377 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.472156 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6sx4w"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.474430 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vh5ns"] Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.491590 4994 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018c45cc_8cfa_497b_b6cf_25b10c694c58.slice/crio-793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344 WatchSource:0}: Error finding container 793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344: Status 404 returned error can't find the container with id 793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.531499 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.537093 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.037050988 +0000 UTC m=+208.210757787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.635152 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.639289 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.139274587 +0000 UTC m=+208.312981336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.639511 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.642883 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.644720 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.696220 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.708736 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.736936 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.737079 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.237058635 +0000 UTC m=+208.410765384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.737735 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.738249 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.238236015 +0000 UTC m=+208.411942764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.775907 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" event={"ID":"3a5ced5c-b690-4a1c-8d48-bbf789366816","Type":"ContainerStarted","Data":"0a60270474c65ca8a943a325db1195f8fb6a2c9bbb7e7276d1716f28a656e96e"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.780354 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"684c79ab646de835ef99f0fc4edc687ed78690b215f63c5c50ab9f33c1f56326"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.787053 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" event={"ID":"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5","Type":"ContainerStarted","Data":"a0cece20202ff106bc018ae86d1363c0e076439cd4076737afc7c948b110c656"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.795302 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.802144 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xx"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.803265 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerStarted","Data":"0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.803780 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804295 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-sz8pz"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804600 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.804637 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.806123 4994 generic.go:334] "Generic (PLEG): container finished" podID="ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21" containerID="130fbf5ea61bd1e2ba5dc7dc75b09124c094f2fbf967022322ff72e08f29e934" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.806238 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerDied","Data":"130fbf5ea61bd1e2ba5dc7dc75b09124c094f2fbf967022322ff72e08f29e934"} 
Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.807247 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5hvbc"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.808710 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerStarted","Data":"e29a59b07d62557e79f131725545f7bbc14a1ca6dcc0ac4661855d156c889001"} Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.809965 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5448e53f_3b74_47f1_9b28_705f36fd6ea3.slice/crio-4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2 WatchSource:0}: Error finding container 4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2: Status 404 returned error can't find the container with id 4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.811616 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" event={"ID":"f15954a6-2036-4c32-a8b6-bc8e227d0fcd","Type":"ContainerStarted","Data":"862b0d860a139426a426969abefaaa8beec085d72eb016f7646e2a789f671e13"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.815469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"f7e9ccdb2334bb8c1236d21a555661e8771e2ed311721152323ab9263229c9ad"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.819083 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" 
event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"34345e98c1e2e3aeb5d04ca6093bf31fe453de1f2bdaa615125c36c47cfd63b8"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839099 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47fkz" event={"ID":"40e41a4c-dba3-4862-9f06-c59c538785be","Type":"ContainerStarted","Data":"c731c52cc501053a9e6cb744a378eb523999983be44bf73047633cb630bcde4f"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839410 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.839582 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.339555822 +0000 UTC m=+208.513262581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.839986 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.840483 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.340470785 +0000 UTC m=+208.514177534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.842339 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"576032bc27c1dd3be1fef9c613168d79e3e52a5fcf4088fa1693aa437e4e506b"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.846448 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerStarted","Data":"18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.851566 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podStartSLOduration=162.851545002 podStartE2EDuration="2m42.851545002s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:53.844644019 +0000 UTC m=+208.018350768" watchObservedRunningTime="2026-03-10 00:09:53.851545002 +0000 UTC m=+208.025251741" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.851864 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" 
event={"ID":"49f58ba5-3573-4894-a320-fcf4ca4e50f1","Type":"ContainerStarted","Data":"7e18e94ab3cc40d64e97f54b8dd1dbfa47c33d18a0cfd95fd87ea04af0af538b"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.853410 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" event={"ID":"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d","Type":"ContainerStarted","Data":"562b4f6ad03fbfc61eb6934ec8bb885e8e0d48326f7f4f305e46bd49329e3596"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.854262 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"884c06f0e4bbf6834679dd6a2a57beae9594323d06c650e327564e27b620f9af"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.854917 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"793a5f9e3411b0af92f6a8ea8f03e0071d26aec82b46778cf3064a7e5ee1f344"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.864193 4994 generic.go:334] "Generic (PLEG): container finished" podID="5937dfbb-0da7-439c-94cb-e0e1f658d464" containerID="2c8a718072fce38c319dd3e089acc3bde7592d06567f719b0ced5363b1177e38" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.864452 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerDied","Data":"2c8a718072fce38c319dd3e089acc3bde7592d06567f719b0ced5363b1177e38"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.866520 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" 
event={"ID":"28b5630a-9f96-453c-ac88-70d75b7d438d","Type":"ContainerStarted","Data":"09662085f4bd1674515195af789aa910121956af3b6df1fb85ec192328d054d3"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.876287 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" event={"ID":"4dbc1f03-d386-460b-81f5-e6b7d3630557","Type":"ContainerStarted","Data":"82f218da8fc6d991a5cd49e6b2a4ec167490c7cb89f8feb1f5d6a556f3224d1e"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.879811 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rlqtz" event={"ID":"11b78073-cc4a-4a6f-89ab-631fde4b3371","Type":"ContainerStarted","Data":"463e9f6c752ae1494ce870d0e395342f9e1c8c71349ad766cac312fc6c1a6a3f"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.882587 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerStarted","Data":"bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.884185 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" event={"ID":"eb21f66e-5c18-49bb-8146-8185434e7c2f","Type":"ContainerStarted","Data":"6b16a5a1e886c45ca9e097a5676bc409e21a4ec9df85eba50d00f6e9744258e0"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.887444 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"59250ab52cbad73db0f4a15a56196056869f59efdeee140226e8cd658de70523"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.894355 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.916055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" event={"ID":"06c3fca2-64b6-47e2-885f-948eac331c10","Type":"ContainerStarted","Data":"9d642500e4292360c763205ff2873028c27555cd29d6e96c3e3f459f6c03e625"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.919499 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" event={"ID":"29f559d0-b505-4855-91e3-e46804b0c9f1","Type":"ContainerStarted","Data":"9f81f43504d157e1822651c9895f8bb4d4e77aa9ab1f5a0a09943aa22704c4ad"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.922820 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" event={"ID":"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e","Type":"ContainerStarted","Data":"b4afdb10540d5ed0856b1c89767c0d9e08ce43a42e7723cf69f249b417909f11"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.926605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x6s5d" event={"ID":"f9b1c3de-e5a3-467f-929b-afb8687fb7f0","Type":"ContainerStarted","Data":"7dd32d215b043f1e255d3b0e44386da08264c8468e9857a042c5c799844f7ee2"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.929604 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"36382d3e5b130d311923877c1260293bc964d90ead9bdf7dbab11f1738111946"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.938716 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" 
event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"4fad456af3276012ecddb69fdf5089880e2b26cd92986f3e857c11107a4952b1"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.940960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.941409 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.44133772 +0000 UTC m=+208.615044469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.941734 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:53 crc kubenswrapper[4994]: E0310 00:09:53.942837 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.442820887 +0000 UTC m=+208.616527646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.960094 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pvfj5"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.963198 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh"] Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.972973 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerID="8e871a87e6dfc98301313edad560135c0d1c2f67516af3aa4d21ffc05dcf1c75" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.973696 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerDied","Data":"8e871a87e6dfc98301313edad560135c0d1c2f67516af3aa4d21ffc05dcf1c75"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.976518 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" 
event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"9819e9aedbb8ff5e71ef5f8d74f93a1e29204f606152756297a69592fbda127a"} Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977072 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977118 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977155 4994 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fxpkq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4994]: I0310 00:09:53.977190 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4994]: W0310 00:09:53.997070 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ad3539f_9691_4344_9c7f_1b015c5e3b3d.slice/crio-c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b WatchSource:0}: Error finding container 
c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b: Status 404 returned error can't find the container with id c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.042661 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.042906 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.542854262 +0000 UTC m=+208.716561051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.043283 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.043708 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.543697773 +0000 UTC m=+208.717404522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.144414 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.146200 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.646174749 +0000 UTC m=+208.819881558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.246057 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.246327 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.746317397 +0000 UTC m=+208.920024146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.346638 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.347019 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.846998837 +0000 UTC m=+209.020705596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.347348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.347726 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.847715475 +0000 UTC m=+209.021422224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.447076 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lgqdf" podStartSLOduration=163.447055872 podStartE2EDuration="2m43.447055872s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:54.443192296 +0000 UTC m=+208.616899045" watchObservedRunningTime="2026-03-10 00:09:54.447055872 +0000 UTC m=+208.620762621" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.448273 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.451113 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:54.951091094 +0000 UTC m=+209.124797843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.553715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.554298 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.054269017 +0000 UTC m=+209.227975756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.556159 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podStartSLOduration=162.556136193 podStartE2EDuration="2m42.556136193s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:54.552309558 +0000 UTC m=+208.726016307" watchObservedRunningTime="2026-03-10 00:09:54.556136193 +0000 UTC m=+208.729842942" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.656329 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.656516 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.156490756 +0000 UTC m=+209.330197505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.657642 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.657986 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.157977734 +0000 UTC m=+209.331684483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.761381 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.761463 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.261447114 +0000 UTC m=+209.435153863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.761658 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.762219 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.262206743 +0000 UTC m=+209.435913492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.862448 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.862580 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.362558826 +0000 UTC m=+209.536265595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.863222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.864129 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.363813107 +0000 UTC m=+209.537519856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.964080 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:54 crc kubenswrapper[4994]: E0310 00:09:54.964528 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.464510418 +0000 UTC m=+209.638217167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.983627 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rlqtz" event={"ID":"11b78073-cc4a-4a6f-89ab-631fde4b3371","Type":"ContainerStarted","Data":"fdab6275ec494362430269075beb94b8c702e2909589a9fa3cd20488ebec44d8"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.987613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x6s5d" event={"ID":"f9b1c3de-e5a3-467f-929b-afb8687fb7f0","Type":"ContainerStarted","Data":"c90a584826553fb02bd29517277e0087eee6110714e60770bda7cb8fa9371309"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.990103 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" event={"ID":"70d3076a-1af2-4aed-93ac-8dbbebd7e7d5","Type":"ContainerStarted","Data":"2fd88818eb9689cd277d2283b6033829ffd881934c4abd7ae705c1ecc7e0971a"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.990620 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.993132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" event={"ID":"5cc67063-d02f-4cb9-a15d-0d0a5c457e6e","Type":"ContainerStarted","Data":"b7d0030c69ad94cc9b98f04701b2bb8ba0e5f4ea4b650053785878dc81668b7a"} Mar 10 00:09:54 
crc kubenswrapper[4994]: I0310 00:09:54.993163 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.994295 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-47fkz" event={"ID":"40e41a4c-dba3-4862-9f06-c59c538785be","Type":"ContainerStarted","Data":"8028ad0ad8a4d65aab0a9577945312f04258071094f29460c386e8be32477640"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.995605 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerStarted","Data":"5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.997090 4994 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d8g97 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.997130 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" podUID="70d3076a-1af2-4aed-93ac-8dbbebd7e7d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.998072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"d7362fbe2cd97e9688561fa1ac7012eeb3d374e2fe9d4405c7e37ab649db642c"} Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 
00:09:54.998051 4994 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4cdv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 10 00:09:54 crc kubenswrapper[4994]: I0310 00:09:54.998117 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podUID="5cc67063-d02f-4cb9-a15d-0d0a5c457e6e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.001718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerStarted","Data":"b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.005509 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"9dbd79c8576252b697bc5e7472d200f7ca1e77cfe0295522d819a7399fa940ea"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.005831 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rlqtz" podStartSLOduration=164.005819153 podStartE2EDuration="2m44.005819153s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.000362376 +0000 UTC m=+209.174069135" watchObservedRunningTime="2026-03-10 00:09:55.005819153 +0000 UTC m=+209.179525902" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 
00:09:55.007410 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"20a0c91236d538b3fa1745ca92b3a27af099486933c6d89f56542107be20f542"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.008634 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"2462c46e1785c092405d62542e8d0b00b229bb8e630d5e0207a2f5dcf487a459"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.009543 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.010616 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" event={"ID":"29f559d0-b505-4855-91e3-e46804b0c9f1","Type":"ContainerStarted","Data":"1fda82ed2dd75e4f1a15333c3f84108c6f138545e24043e618d968852d6e4eff"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.012369 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"1560d54b45e99a47fa9666b7502cda6fe98263942b080cacdf734ec5737064c9"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.014094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" 
event={"ID":"eb21f66e-5c18-49bb-8146-8185434e7c2f","Type":"ContainerStarted","Data":"6a43d5b7a3bb741ebbd18604c53964ec33e95cda9a8f13a07191480dc1f1e5d5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.015527 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"d50670bd2c632b0f5afa80cc22121a3bdfc8020046cf4990c915ae61736f1197"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.019216 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" event={"ID":"0ad3539f-9691-4344-9c7f-1b015c5e3b3d","Type":"ContainerStarted","Data":"dcaefeae6fdcccd19f234f83b7d51bb8fd068722145409530bbf2ffa137202bd"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.019261 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" event={"ID":"0ad3539f-9691-4344-9c7f-1b015c5e3b3d","Type":"ContainerStarted","Data":"c29635a43a370ddcef70193ceb4b233fcaa550295f8e461c5e67182d4585854b"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.024926 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podStartSLOduration=163.024906391 podStartE2EDuration="2m43.024906391s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.015808122 +0000 UTC m=+209.189514891" watchObservedRunningTime="2026-03-10 00:09:55.024906391 +0000 UTC m=+209.198613140" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.025235 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hvbc" 
event={"ID":"22c138b8-4431-4695-be3f-0ea008d21f30","Type":"ContainerStarted","Data":"42057946215ffa6c798c8b0b4ea49811de3193e1f9b262beec1c0d1cc4a4a037"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.025296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5hvbc" event={"ID":"22c138b8-4431-4695-be3f-0ea008d21f30","Type":"ContainerStarted","Data":"05e3932997f8b46fbf348ee0934f66ab6c46ae09a1f9db2c21f99ea679745e81"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.028578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" event={"ID":"f15954a6-2036-4c32-a8b6-bc8e227d0fcd","Type":"ContainerStarted","Data":"34d25c6a2ee4eeee418c3880eacd2750fa818e7f6c4a58ca8767f5904cb57411"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.035540 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" podStartSLOduration=163.035522506 podStartE2EDuration="2m43.035522506s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.033918726 +0000 UTC m=+209.207625485" watchObservedRunningTime="2026-03-10 00:09:55.035522506 +0000 UTC m=+209.209229255" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.037609 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerStarted","Data":"f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.040007 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" 
event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"24c1fefc2882e1802747b7b3ebf67c3834e12e98460d60417f7a36bc60de9cb1"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.040083 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"4604c46a6249354e89435fdced2bee20c5ff8f5966359db2287131c85a900ae2"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.041044 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" event={"ID":"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4","Type":"ContainerStarted","Data":"4f78110283db462476124633cbb5e70ba67655950f51a76567b14aad17ab4e79"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.043969 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" event={"ID":"1109e060-ef32-407d-8283-eba65e1d4eaa","Type":"ContainerStarted","Data":"24f1352c9336dd3ec53fc571a9de8e74008c187b3edad6056ea5e61514891ec0"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.047530 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" event={"ID":"a2bd9787-6df4-492a-8cab-18201a143385","Type":"ContainerStarted","Data":"892ce50aa95222df79ba6f81fa34311ba310427496cc4a27cb6a4523c8d16520"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.050943 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" event={"ID":"1e80a388-91b3-42f1-9ee2-70ab4850652d","Type":"ContainerStarted","Data":"daab938e9c2fed71fd9ddd4943fb96f2cc1ac97606c4f53965a269a0ac0033ef"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.052269 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x6s5d" podStartSLOduration=164.052233624 podStartE2EDuration="2m44.052233624s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.051540477 +0000 UTC m=+209.225247246" watchObservedRunningTime="2026-03-10 00:09:55.052233624 +0000 UTC m=+209.225940373" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.054459 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" event={"ID":"fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d","Type":"ContainerStarted","Data":"f0af0f093c6ecb03be70c52a8245ae76e3b834b8e24a7ea4b7b1aa74d2c3d221"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.055746 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" event={"ID":"49f58ba5-3573-4894-a320-fcf4ca4e50f1","Type":"ContainerStarted","Data":"688973cae337fece4be2344fdc1fac73b3b7502d0d929e7d31d3ba244653b544"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.056011 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057202 4994 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bkq7b container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057241 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" 
podUID="49f58ba5-3573-4894-a320-fcf4ca4e50f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.057611 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.058021 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.060374 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.060409 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.065570 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.065936 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.565925057 +0000 UTC m=+209.739631806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.067558 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-47fkz" podStartSLOduration=6.067538328 podStartE2EDuration="6.067538328s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.064592584 +0000 UTC m=+209.238299343" watchObservedRunningTime="2026-03-10 00:09:55.067538328 +0000 UTC m=+209.241245077" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.070387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"80b6d2bc1e285aeb1ca7ae7979928b3e2d6c3d1c1772bfc23e9a845a4b245bc5"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.071693 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerStarted","Data":"8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.073982 
4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.075579 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.075622 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.076422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" event={"ID":"4dbc1f03-d386-460b-81f5-e6b7d3630557","Type":"ContainerStarted","Data":"5394779b6c263bbb8c6c5314221b41c0f79a1573e13a61c85573645f35a787c2"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.078320 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" event={"ID":"46e9d10d-c8fb-45a4-87a3-4879dd3f5c4d","Type":"ContainerStarted","Data":"c7f42fde54c7c039a01a7870000533ca9c1566739cc9d63a752c5d7902d1cb89"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.079102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"14861c8d1aac432a04ec6fccb527599ebf8d5d4d4e7d239ed4bcca6f9ca85ce6"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.080504 4994 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" event={"ID":"4c9cbda0-655c-4cf9-8f9a-23b3ebf37339","Type":"ContainerStarted","Data":"36fc7c2c9305493a786c31a0378a6b14c114abc68b4a313dee8e717fec27dc9f"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerStarted","Data":"ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7"} Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082522 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082649 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082733 4994 patch_prober.go:28] interesting pod/console-operator-58897d9998-pwpc6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.082776 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" podUID="6ed61f01-8d13-4883-ac58-0e998df5c20d" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.091176 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" podStartSLOduration=163.091152049 podStartE2EDuration="2m43.091152049s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.090272767 +0000 UTC m=+209.263979536" watchObservedRunningTime="2026-03-10 00:09:55.091152049 +0000 UTC m=+209.264858808" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.107215 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8lrmb" podStartSLOduration=164.107195011 podStartE2EDuration="2m44.107195011s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.104731689 +0000 UTC m=+209.278438448" watchObservedRunningTime="2026-03-10 00:09:55.107195011 +0000 UTC m=+209.280901760" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.125337 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hqlnc" podStartSLOduration=164.125312324 podStartE2EDuration="2m44.125312324s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.11752414 +0000 UTC m=+209.291230889" watchObservedRunningTime="2026-03-10 00:09:55.125312324 +0000 UTC m=+209.299019083" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.137465 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4xmwf" podStartSLOduration=164.137449049 podStartE2EDuration="2m44.137449049s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.135380887 +0000 UTC m=+209.309087636" watchObservedRunningTime="2026-03-10 00:09:55.137449049 +0000 UTC m=+209.311155798" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.155169 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zpr72" podStartSLOduration=164.155152712 podStartE2EDuration="2m44.155152712s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.154188818 +0000 UTC m=+209.327895577" watchObservedRunningTime="2026-03-10 00:09:55.155152712 +0000 UTC m=+209.328859461" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.166519 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.166701 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.66667821 +0000 UTC m=+209.840384959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.167009 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.168273 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.66825499 +0000 UTC m=+209.841961749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.218855 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-m6jnx" podStartSLOduration=164.218827246 podStartE2EDuration="2m44.218827246s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.185314347 +0000 UTC m=+209.359021116" watchObservedRunningTime="2026-03-10 00:09:55.218827246 +0000 UTC m=+209.392534015" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.222215 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-vjj5j" podStartSLOduration=164.22220271 podStartE2EDuration="2m44.22220271s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.20219835 +0000 UTC m=+209.375905129" watchObservedRunningTime="2026-03-10 00:09:55.22220271 +0000 UTC m=+209.395909480" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.231618 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-725jp" podStartSLOduration=164.231591765 podStartE2EDuration="2m44.231591765s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.224238472 +0000 UTC m=+209.397945271" watchObservedRunningTime="2026-03-10 00:09:55.231591765 +0000 UTC m=+209.405298514" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.242510 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2m4gh" podStartSLOduration=164.242479078 podStartE2EDuration="2m44.242479078s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.240464168 +0000 UTC m=+209.414170917" watchObservedRunningTime="2026-03-10 00:09:55.242479078 +0000 UTC m=+209.416185827" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.255216 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podStartSLOduration=164.255199447 podStartE2EDuration="2m44.255199447s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.253038953 +0000 UTC m=+209.426745712" watchObservedRunningTime="2026-03-10 00:09:55.255199447 +0000 UTC m=+209.428906196" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.269205 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.269601 4994 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.769573577 +0000 UTC m=+209.943280386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.270011 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.271400 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.771388492 +0000 UTC m=+209.945095321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.276611 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mpvlf" podStartSLOduration=164.276586672 podStartE2EDuration="2m44.276586672s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.268816278 +0000 UTC m=+209.442523027" watchObservedRunningTime="2026-03-10 00:09:55.276586672 +0000 UTC m=+209.450293421" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.294649 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podStartSLOduration=163.294630954 podStartE2EDuration="2m43.294630954s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.293919997 +0000 UTC m=+209.467626796" watchObservedRunningTime="2026-03-10 00:09:55.294630954 +0000 UTC m=+209.468337703" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.371258 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.371601 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.87157519 +0000 UTC m=+210.045281959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.372677 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.373518 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.873507199 +0000 UTC m=+210.047213948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.473906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.474748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:55.974728333 +0000 UTC m=+210.148435082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.576609 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.577139 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.077100407 +0000 UTC m=+210.250807206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.679637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.679935 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.17989469 +0000 UTC m=+210.353601439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.680703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.681057 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.181045869 +0000 UTC m=+210.354752618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.782753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.783115 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.283090514 +0000 UTC m=+210.456797263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.783351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.783748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.283739621 +0000 UTC m=+210.457446370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.850693 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.853098 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.853189 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.885174 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.885748 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.385714894 +0000 UTC m=+210.559421643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.886496 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.886998 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.386979025 +0000 UTC m=+210.560685774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.987351 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.987491 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.487464681 +0000 UTC m=+210.661171460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:55 crc kubenswrapper[4994]: I0310 00:09:55.988228 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:55 crc kubenswrapper[4994]: E0310 00:09:55.988858 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.488828576 +0000 UTC m=+210.662535495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.089411 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.089559 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.589537167 +0000 UTC m=+210.763243916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.089598 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.089992 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.589980108 +0000 UTC m=+210.763686857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.091375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wv4d4" event={"ID":"1377f73a-df08-4450-afa1-960e15891141","Type":"ContainerStarted","Data":"f965e4bdb02543038bc4a981e562cddd7e381bb2d24ba9a040b99b6e1e92416d"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.092113 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.094977 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" event={"ID":"3ffed56d-e2ab-4fa9-9dac-98c382395f2f","Type":"ContainerStarted","Data":"00929d6fc9c599b907598b3546f56ee130547c1395ea2c4cb9be0fe6da4999a6"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.096955 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerStarted","Data":"a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.098366 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" event={"ID":"28b5630a-9f96-453c-ac88-70d75b7d438d","Type":"ContainerStarted","Data":"ca2bf1116193219cdd988fc6185e54ae625be4d81126fe0d0819e83f847bd653"} Mar 10 00:09:56 crc 
kubenswrapper[4994]: I0310 00:09:56.100787 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" event={"ID":"10eead56-2e9b-4d48-ab81-d1638b3cdddc","Type":"ContainerStarted","Data":"3e518552546dce36ddb039ca2542fbbaa1c3fe1c2b7ebbff223c196e516b986d"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.102893 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" event={"ID":"964eb9a7-a580-44d4-b5e5-fe84d085823c","Type":"ContainerStarted","Data":"4a6b7543dffaf97e57a78e1f116971619d8dd0a2bd6e5852c145f4ec4559ba38"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.104975 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" event={"ID":"5937dfbb-0da7-439c-94cb-e0e1f658d464","Type":"ContainerStarted","Data":"b2622b6e369b3bc1c122bb59b37130409b242aeb0a819b1f2e4fe178f09fd834"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.106210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" event={"ID":"a2bd9787-6df4-492a-8cab-18201a143385","Type":"ContainerStarted","Data":"234cdec71c533bc815a74e577d193619fa75ef730b685db920e9eb073edb3fdb"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.108785 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wv4d4" podStartSLOduration=7.108768279 podStartE2EDuration="7.108768279s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.106803919 +0000 UTC m=+210.280510698" watchObservedRunningTime="2026-03-10 00:09:56.108768279 +0000 UTC m=+210.282475028" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.110328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" event={"ID":"2204937d-9632-46e6-8f26-0cea8593d1a5","Type":"ContainerStarted","Data":"4d243ddd65462760dbee4f1651ddaa4809dfeb893cd7cdf23d37cbd6b811484f"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.111016 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.116072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"b3e88a3c77c3dbe81c13b71d89325919a585d6d800ed9ee595d2dd5b462d8747"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.118703 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"399e02d23d05e73f7894f67ce3daf4adc1113c0ce6603248bb5fb54dd2e96ce0"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.123262 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fkpf5" podStartSLOduration=165.123230291 podStartE2EDuration="2m45.123230291s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.119469976 +0000 UTC m=+210.293176725" watchObservedRunningTime="2026-03-10 00:09:56.123230291 +0000 UTC m=+210.296937030" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.128744 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" 
event={"ID":"018c45cc-8cfa-497b-b6cf-25b10c694c58","Type":"ContainerStarted","Data":"97066324072ecc4daa4504d93999ce14cec65927cbefe353672ed493b9910920"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.132381 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" event={"ID":"e14d0dda-fe3a-4ea4-96e1-31ffc8c016e4","Type":"ContainerStarted","Data":"15bae0b264501d6183ac485bbda4701175856674d4622471a6212abc557f5d2a"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.135755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" event={"ID":"1e80a388-91b3-42f1-9ee2-70ab4850652d","Type":"ContainerStarted","Data":"37b3d7d2c792686c78d21e85d376bbb0ce24db71e2ebde23bcca29f40317cd89"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.138307 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" event={"ID":"5448e53f-3b74-47f1-9b28-705f36fd6ea3","Type":"ContainerStarted","Data":"93d8e1d51b5bce53cefebc9c4a8bc10bc17991737eb9b549fabd5d8e6c562d61"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.143277 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vh5ns" podStartSLOduration=164.143259372 podStartE2EDuration="2m44.143259372s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.139302422 +0000 UTC m=+210.313009171" watchObservedRunningTime="2026-03-10 00:09:56.143259372 +0000 UTC m=+210.316966131" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.145479 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" event={"ID":"2f6b37d7-cd0a-4e80-8fae-abc8ab7c748a","Type":"ContainerStarted","Data":"6e6a1a4eefe4c6e2cbbd5fc27ecdc34d61c889dd0df49be92721df7bed5fca11"} Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146083 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146143 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146179 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146227 4994 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bkq7b container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146237 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial 
tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146259 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" podUID="49f58ba5-3573-4894-a320-fcf4ca4e50f1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146427 4994 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c4cdv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146458 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" podUID="5cc67063-d02f-4cb9-a15d-0d0a5c457e6e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146722 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146771 4994 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-d8g97 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.146794 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" 
podUID="70d3076a-1af2-4aed-93ac-8dbbebd7e7d5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.153134 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.153183 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.190472 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.192750 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.692728261 +0000 UTC m=+210.866435020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.195091 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.195426 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.695410127 +0000 UTC m=+210.869116886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.238653 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" podStartSLOduration=164.23863539 podStartE2EDuration="2m44.23863539s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.238110407 +0000 UTC m=+210.411817156" watchObservedRunningTime="2026-03-10 00:09:56.23863539 +0000 UTC m=+210.412342139" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.263771 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5t2rc" podStartSLOduration=165.263757919 podStartE2EDuration="2m45.263757919s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.26219617 +0000 UTC m=+210.435902919" watchObservedRunningTime="2026-03-10 00:09:56.263757919 +0000 UTC m=+210.437464668" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.303932 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.304228 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29551680-sz8pz" podStartSLOduration=165.304213892 podStartE2EDuration="2m45.304213892s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.303701369 +0000 UTC m=+210.477408118" watchObservedRunningTime="2026-03-10 00:09:56.304213892 +0000 UTC m=+210.477920641" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.305260 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.805239528 +0000 UTC m=+210.978946277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.325765 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nf5dh" podStartSLOduration=164.325742431 podStartE2EDuration="2m44.325742431s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.323754891 +0000 UTC m=+210.497461650" watchObservedRunningTime="2026-03-10 00:09:56.325742431 +0000 UTC m=+210.499449180" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.349188 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" podStartSLOduration=165.349168847 podStartE2EDuration="2m45.349168847s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.348735127 +0000 UTC m=+210.522441886" watchObservedRunningTime="2026-03-10 00:09:56.349168847 +0000 UTC m=+210.522875596" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.370670 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podStartSLOduration=165.370650075 podStartE2EDuration="2m45.370650075s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.36641091 +0000 UTC m=+210.540117669" watchObservedRunningTime="2026-03-10 00:09:56.370650075 +0000 UTC m=+210.544356874" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.405507 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lzcrv" podStartSLOduration=164.405486087 podStartE2EDuration="2m44.405486087s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.392265157 +0000 UTC m=+210.565971916" watchObservedRunningTime="2026-03-10 00:09:56.405486087 +0000 UTC m=+210.579192836" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.407341 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jkd8t" podStartSLOduration=165.407331394 podStartE2EDuration="2m45.407331394s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.404670117 +0000 UTC m=+210.578376866" watchObservedRunningTime="2026-03-10 00:09:56.407331394 +0000 UTC m=+210.581038143" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.409786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc 
kubenswrapper[4994]: E0310 00:09:56.410180 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:56.910165915 +0000 UTC m=+211.083872664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.424079 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-966nr" podStartSLOduration=164.424061123 podStartE2EDuration="2m44.424061123s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.420287909 +0000 UTC m=+210.593994658" watchObservedRunningTime="2026-03-10 00:09:56.424061123 +0000 UTC m=+210.597767872" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.443224 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-pvfj5" podStartSLOduration=164.443207262 podStartE2EDuration="2m44.443207262s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.440585827 +0000 UTC m=+210.614292596" watchObservedRunningTime="2026-03-10 00:09:56.443207262 +0000 UTC m=+210.616914011" Mar 10 
00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.457514 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5hvbc" podStartSLOduration=7.45749843 podStartE2EDuration="7.45749843s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.456472085 +0000 UTC m=+210.630178834" watchObservedRunningTime="2026-03-10 00:09:56.45749843 +0000 UTC m=+210.631205169" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.507716 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xhzh4" podStartSLOduration=165.507698537 podStartE2EDuration="2m45.507698537s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.50583418 +0000 UTC m=+210.679540929" watchObservedRunningTime="2026-03-10 00:09:56.507698537 +0000 UTC m=+210.681405286" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.508915 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hmt98" podStartSLOduration=165.508908617 podStartE2EDuration="2m45.508908617s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.484829585 +0000 UTC m=+210.658536344" watchObservedRunningTime="2026-03-10 00:09:56.508908617 +0000 UTC m=+210.682615366" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510239 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.510373 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.010358084 +0000 UTC m=+211.184064833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510470 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 
00:09:56.510534 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.510616 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.511287 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.011270766 +0000 UTC m=+211.184977515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.522045 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.523108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.538661 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.555601 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" 
podStartSLOduration=164.555584156 podStartE2EDuration="2m44.555584156s" podCreationTimestamp="2026-03-10 00:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.533489192 +0000 UTC m=+210.707195941" watchObservedRunningTime="2026-03-10 00:09:56.555584156 +0000 UTC m=+210.729290905" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.559091 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" podStartSLOduration=165.559084823 podStartE2EDuration="2m45.559084823s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:56.55375136 +0000 UTC m=+210.727458109" watchObservedRunningTime="2026-03-10 00:09:56.559084823 +0000 UTC m=+210.732791562" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611494 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611618 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.611676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.612014 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.111987538 +0000 UTC m=+211.285694287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.616410 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.616620 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c125b3-4a9c-46a7-a468-54e93c44751d-metrics-certs\") pod \"network-metrics-daemon-vxjt2\" (UID: \"f4c125b3-4a9c-46a7-a468-54e93c44751d\") " pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.691598 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.712270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.712674 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.212658949 +0000 UTC m=+211.386365698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.713643 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxjt2" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.726720 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.736683 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.787128 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.787830 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.791381 4994 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-xtfzl container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.791488 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" podUID="5937dfbb-0da7-439c-94cb-e0e1f658d464" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.812994 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.813197 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.313167725 +0000 UTC m=+211.486874474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.813456 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.813794 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.31378344 +0000 UTC m=+211.487490189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.853254 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.853356 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.914412 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.914600 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.414576244 +0000 UTC m=+211.588282993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.914755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:56 crc kubenswrapper[4994]: E0310 00:09:56.915148 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.415133078 +0000 UTC m=+211.588839827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:56 crc kubenswrapper[4994]: I0310 00:09:56.965233 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxjt2"] Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.015560 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.015817 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.515800898 +0000 UTC m=+211.689507647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.056158 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89 WatchSource:0}: Error finding container 8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89: Status 404 returned error can't find the container with id 8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89 Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.117926 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.118303 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.618288295 +0000 UTC m=+211.791995044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.154741 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8db223696e8ca4d3c62124625f1bdb16b634b5c0f256b76f7d08b7a96df3ed89"} Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.160477 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" event={"ID":"ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21","Type":"ContainerStarted","Data":"1b58b06db75022f3b55ce331168196e97a6d8615bab038fd3308b681b7b3452c"} Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.166701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"1d5629d59e513cf0201f405d88fdc5c855901aca9fe25c7a180bf3d7ba725320"} Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.168314 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.168355 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.172982 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173007 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tgf68 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173027 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.173056 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.198359 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" podStartSLOduration=166.198339809 podStartE2EDuration="2m46.198339809s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:57.180888292 +0000 UTC m=+211.354595041" watchObservedRunningTime="2026-03-10 00:09:57.198339809 +0000 UTC m=+211.372046558" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.200059 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6sx4w" podStartSLOduration=166.200055472 podStartE2EDuration="2m46.200055472s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:57.197330114 +0000 UTC m=+211.371036863" watchObservedRunningTime="2026-03-10 00:09:57.200055472 +0000 UTC m=+211.373762221" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.219266 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.219367 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.719350365 +0000 UTC m=+211.893057114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.220739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.222390 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.722380641 +0000 UTC m=+211.896087390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.321549 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.321787 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:57.82176656 +0000 UTC m=+211.995473309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.322605 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863 WatchSource:0}: Error finding container cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863: Status 404 returned error can't find the container with id cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863 Mar 10 00:09:57 crc kubenswrapper[4994]: W0310 00:09:57.333040 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63 WatchSource:0}: Error finding container 8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63: Status 404 returned error can't find the container with id 8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63 Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.603506 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc 
kubenswrapper[4994]: E0310 00:09:57.604080 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.104058897 +0000 UTC m=+212.277765677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.705224 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.705423 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.205390874 +0000 UTC m=+212.379097613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.706099 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.706981 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.206961564 +0000 UTC m=+212.380668353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.773097 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775019 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775094 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775749 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775803 4994 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775857 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.775890 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.807336 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.807572 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.307531622 +0000 UTC m=+212.481238371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.807716 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.808329 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.308310652 +0000 UTC m=+212.482017401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.852700 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.852764 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 00:09:57 crc kubenswrapper[4994]: I0310 00:09:57.909724 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:57 crc kubenswrapper[4994]: E0310 00:09:57.910213 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.410189893 +0000 UTC m=+212.583896642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.010475 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.011338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.511313065 +0000 UTC m=+212.685019814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.111475 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.111766 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.61174849 +0000 UTC m=+212.785455239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.190672 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3f33b6c56cfac1967a4dcbddceca032158c67699506b06995e597ac4b64027e4"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.192755 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f13ea4e720f3258d8b7a44cbb5496b8925efb6c962c67c6dccfa890d7fb97d6"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.192844 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cfaf89a2969de096994cddd0555263d037fee8a7bf136c9d71487a28ab94a863"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.193830 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"683f739c9fae78029d3e208484563aaa46773767208f7226c53323f5c7fc2207"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.195241 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"eebe108915a33f04c0847174595c1bed992e476de8a176720fc247e2ae933044"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.195276 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8abc1f17c56089b934df9306ad8ce987a405b1b8f04741ab0124507546ed2e63"} Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.218538 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.219818 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.719801175 +0000 UTC m=+212.893508124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.319465 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.321182 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.819714517 +0000 UTC m=+212.993421266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.321269 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.321652 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.821630784 +0000 UTC m=+212.995337533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.422354 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.422543 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.92251644 +0000 UTC m=+213.096223189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.422601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.422921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:58.922909391 +0000 UTC m=+213.096616140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.526169 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.526517 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.026472603 +0000 UTC m=+213.200179352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.526693 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.527043 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.027027187 +0000 UTC m=+213.200733936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.627651 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.628061 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.128000555 +0000 UTC m=+213.301707304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.628213 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.628668 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.128657392 +0000 UTC m=+213.302364141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.729170 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.729365 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.229333592 +0000 UTC m=+213.403040351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.729446 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.729714 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.229702181 +0000 UTC m=+213.403408930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.830666 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.830849 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.330819973 +0000 UTC m=+213.504526722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.831025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.831358 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.331346246 +0000 UTC m=+213.505052995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.866525 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:58 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:09:58 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:09:58 crc kubenswrapper[4994]: healthz check failed Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.866599 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:58 crc kubenswrapper[4994]: I0310 00:09:58.932682 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:58 crc kubenswrapper[4994]: E0310 00:09:58.933067 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:59.433031463 +0000 UTC m=+213.606738212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.034926 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.035327 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.535313283 +0000 UTC m=+213.709020032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.130610 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59690: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.136214 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.136557 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.636542108 +0000 UTC m=+213.810248857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.194817 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59702: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.200749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxjt2" event={"ID":"f4c125b3-4a9c-46a7-a468-54e93c44751d","Type":"ContainerStarted","Data":"ea7e1f47de27d1f230c1087175c3b1b836fe0f70fdf5858eac2cae561d8c7863"} Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.220224 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59706: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.237647 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.238401 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.738388898 +0000 UTC m=+213.912095647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.297528 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59714: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.321198 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.322011 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.333073 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339597 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339628 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.339659 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.339860 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.839780557 +0000 UTC m=+214.013487316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.350488 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.399142 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59730: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.416520 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59732: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441040 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441077 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441113 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.441465 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.441528 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:59.941511884 +0000 UTC m=+214.115218633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.461780 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.541901 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.542097 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.042077682 +0000 UTC m=+214.215784431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.542290 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.542680 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.042669746 +0000 UTC m=+214.216376495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.562108 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59734: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.643737 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.644205 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.144174379 +0000 UTC m=+214.317881168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.650675 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.747961 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.748844 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.248833079 +0000 UTC m=+214.422539828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.850071 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.850340 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.350304159 +0000 UTC m=+214.524010908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.850399 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.850934 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.350916904 +0000 UTC m=+214.524623643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.856537 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:59 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:09:59 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:09:59 crc kubenswrapper[4994]: healthz check failed Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.856611 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.858320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:59 crc kubenswrapper[4994]: W0310 00:09:59.871077 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc3d610d6_85f4_43b2_a597_4955431daa70.slice/crio-31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be WatchSource:0}: Error finding container 31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be: Status 404 returned error can't find the container with id 31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be Mar 10 00:09:59 crc 
kubenswrapper[4994]: I0310 00:09:59.940763 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59742: no serving certificate available for the kubelet" Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.951917 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.952074 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.452048496 +0000 UTC m=+214.625755245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:59 crc kubenswrapper[4994]: I0310 00:09:59.952178 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:09:59 crc kubenswrapper[4994]: E0310 00:09:59.952499 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.452489518 +0000 UTC m=+214.626196267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.052863 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.052976 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.552955003 +0000 UTC m=+214.726661752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.053152 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.053471 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.553448856 +0000 UTC m=+214.727169325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.129275 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.130268 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.135609 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.154717 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.154921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.654904226 +0000 UTC m=+214.828610975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.154959 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.155016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.155247 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.655241024 +0000 UTC m=+214.828947773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.210606 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerStarted","Data":"31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be"} Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.228195 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vxjt2" podStartSLOduration=169.22817835 podStartE2EDuration="2m49.22817835s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:00.226734615 +0000 UTC m=+214.400441364" watchObservedRunningTime="2026-03-10 00:10:00.22817835 +0000 UTC m=+214.401885099" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.256338 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.256545 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ch2\" (UniqueName: 
\"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.257331 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.75731723 +0000 UTC m=+214.931023979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.292295 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"auto-csr-approver-29551690-7rbl8\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.358413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 
00:10:00.358921 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.858854613 +0000 UTC m=+215.032561362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.442051 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.459279 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.459410 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.95938125 +0000 UTC m=+215.133087999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.459582 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.459819 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:00.95981241 +0000 UTC m=+215.133519159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.562456 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.562998 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.062972614 +0000 UTC m=+215.236679393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.629196 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59746: no serving certificate available for the kubelet" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.664663 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.664953 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.164942656 +0000 UTC m=+215.338649405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.680505 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.765517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.765970 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.265950455 +0000 UTC m=+215.439657204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768536 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768585 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768627 4994 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mzg85 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.768669 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" podUID="4c9cbda0-655c-4cf9-8f9a-23b3ebf37339" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: 
connection refused" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.854024 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:00 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:00 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:00 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.854098 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.867610 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.868073 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.368053391 +0000 UTC m=+215.541760140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.968474 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.968684 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.46865375 +0000 UTC m=+215.642360509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.968744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:00 crc kubenswrapper[4994]: E0310 00:10:00.969117 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.469104132 +0000 UTC m=+215.642810881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.973241 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.973843 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.976728 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.976954 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4994]: I0310 00:10:00.984282 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.069821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.070067 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.070179 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.070329 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.570310696 +0000 UTC m=+215.744017455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171180 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171296 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.171412 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.171553 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.671542311 +0000 UTC m=+215.845249060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.217510 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"e23e4a004f421713fe5580c3caf54896d2c54a646b5ca0dd5f920ecb04055cc5"} Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.220643 4994 generic.go:334] "Generic (PLEG): container finished" podID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerID="f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98" exitCode=0 Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.220709 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerDied","Data":"f1e1d34ce7a507537fd76dd423d8de4db96972c855db85d1fc738caf85d29e98"} Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.222634 4994 generic.go:334] "Generic (PLEG): container finished" podID="c3d610d6-85f4-43b2-a597-4955431daa70" 
containerID="c633aedd351850595d229a68c4652520ab947d1ecb51a46d4cd387b12bdf57bf" exitCode=0 Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.222679 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerDied","Data":"c633aedd351850595d229a68c4652520ab947d1ecb51a46d4cd387b12bdf57bf"} Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.272122 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.272294 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.772265953 +0000 UTC m=+215.945972702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.272414 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.272695 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.772688823 +0000 UTC m=+215.946395572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.283341 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.288063 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.373685 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.373905 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.873863236 +0000 UTC m=+216.047569985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.374299 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.374626 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.874617815 +0000 UTC m=+216.048324564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.457435 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.457670 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" containerID="cri-o://ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" gracePeriod=30 Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.458326 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.458415 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.475546 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.475689 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.975667265 +0000 UTC m=+216.149374014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.475796 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.476084 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:01.976070535 +0000 UTC m=+216.149777284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.505336 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.505554 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" containerID="cri-o://0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" gracePeriod=30 Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.508978 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.577546 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.577917 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:02.077901465 +0000 UTC m=+216.251608214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.660067 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.679587 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.680544 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.180531154 +0000 UTC m=+216.354238013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.740205 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pwpc6" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.747144 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.747190 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.748505 4994 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lxxqb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.748557 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb" podUID="ac5d8ed1-6cb6-4ed6-b0b4-7a2c795dbb21" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.19:8443/livez\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.780977 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.781198 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.281172084 +0000 UTC m=+216.454878833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.781297 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.781587 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.281579785 +0000 UTC m=+216.455286534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.872961 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:01 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:01 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:01 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.873034 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.879821 4994 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bkhqb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.879892 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.883328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.883642 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.38362762 +0000 UTC m=+216.557334369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.911369 4994 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xm7j6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.911425 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.971478 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59758: no serving certificate available for the kubelet" Mar 10 00:10:01 crc kubenswrapper[4994]: I0310 00:10:01.984388 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:01 crc kubenswrapper[4994]: E0310 00:10:01.984746 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.484730831 +0000 UTC m=+216.658437580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.064561 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.085935 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.086116 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.586090599 +0000 UTC m=+216.759797348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.086361 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.086741 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.586728555 +0000 UTC m=+216.760435304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.161385 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c4cdv" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.185758 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.185808 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.188969 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189029 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189226 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.189346 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.689325574 +0000 UTC m=+216.863032333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.189533 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.190240 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.690229586 +0000 UTC m=+216.863936335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.230672 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d8g97" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250043 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250095 4994 generic.go:334] "Generic (PLEG): container finished" podID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerID="ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" exitCode=2 Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.250205 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerDied","Data":"ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7"} Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.295424 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.295591 4994 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.795567184 +0000 UTC m=+216.969273933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.295906 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.297152 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.797141914 +0000 UTC m=+216.970848753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.306406 4994 generic.go:334] "Generic (PLEG): container finished" podID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerID="0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" exitCode=0 Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.306653 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerDied","Data":"0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76"} Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.349464 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bkq7b" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.397433 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.398324 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:02.898300997 +0000 UTC m=+217.072007796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.442944 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.444094 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.446707 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498766 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498800 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498829 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.498906 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.499154 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:02.999142361 +0000 UTC m=+217.172849110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499676 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499707 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499825 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.499885 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.505187 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600241 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.600523 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.600646 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:03.100615502 +0000 UTC m=+217.274322251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.601450 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.601541 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.627622 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"certified-operators-bwzk5\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.639547 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.644143 4994 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.645065 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.650366 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.660351 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xtfzl" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.672289 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702144 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702193 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702239 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod 
\"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.702262 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.703239 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.203228182 +0000 UTC m=+217.376934921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.757257 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.803473 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.803594 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.303574623 +0000 UTC m=+217.477281372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.803978 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804418 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"community-operators-zv2kt\" (UID: 
\"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.804790 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.304780724 +0000 UTC m=+217.478487473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.804999 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.805296 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.805347 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.837740 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.839016 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.842716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"community-operators-zv2kt\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.843048 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.851915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.854686 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:02 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:02 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:02 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.854725 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908316 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908581 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908691 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.908728 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:02 crc kubenswrapper[4994]: E0310 00:10:02.909496 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.409481725 +0000 UTC m=+217.583188474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:02 crc kubenswrapper[4994]: I0310 00:10:02.998396 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010212 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010272 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010378 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010709 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod 
\"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.010961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.011010 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.510994917 +0000 UTC m=+217.684701736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.031793 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"certified-operators-c4tz9\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.040037 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.041542 4994 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.062361 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111388 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.111571 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.611546364 +0000 UTC m=+217.785253113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111614 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111842 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.111969 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: 
\"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.112538 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.612523849 +0000 UTC m=+217.786230598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.169453 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.213443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:03.713418336 +0000 UTC m=+217.887125085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213663 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213938 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.213987 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: 
\"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214121 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.214929 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.215811 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.715511787 +0000 UTC m=+217.889218566 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.236175 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"community-operators-s7qcn\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.315035 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.315618 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.815586903 +0000 UTC m=+217.989293702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.370254 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.417418 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.417764 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:03.917750542 +0000 UTC m=+218.091457291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.519268 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.519407 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.019381016 +0000 UTC m=+218.193087785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.521947 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.522436 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.022418972 +0000 UTC m=+218.196125721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.623395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.623530 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.123504183 +0000 UTC m=+218.297210922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.623736 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.624409 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.124363274 +0000 UTC m=+218.298070033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.725426 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.725625 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.225598499 +0000 UTC m=+218.399305248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.725820 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.726143 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.226134893 +0000 UTC m=+218.399841642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.773584 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mzg85" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.827106 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.827332 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.327303766 +0000 UTC m=+218.501010525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.827452 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.827752 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.327740137 +0000 UTC m=+218.501446886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.855404 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:03 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:03 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:03 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.855466 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:03 crc kubenswrapper[4994]: I0310 00:10:03.928618 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:03 crc kubenswrapper[4994]: E0310 00:10:03.928904 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:04.428889969 +0000 UTC m=+218.602596718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.030694 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.031247 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.531221822 +0000 UTC m=+218.704928571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.131392 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.131674 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.631621905 +0000 UTC m=+218.805328654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.166185 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wv4d4" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.233374 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.235451 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.735432134 +0000 UTC m=+218.909138903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.334313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.334529 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.834501145 +0000 UTC m=+219.008207894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.334636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.335097 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.835078039 +0000 UTC m=+219.008784798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.435819 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.436018 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.935982036 +0000 UTC m=+219.109688825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.436072 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.436357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:04.936344805 +0000 UTC m=+219.110051554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.537292 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.537501 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.037471387 +0000 UTC m=+219.211178146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.537567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.537983 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.037971949 +0000 UTC m=+219.211678708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.568303 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59762: no serving certificate available for the kubelet" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.636482 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.637505 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.638323 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.638494 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.138464315 +0000 UTC m=+219.312171064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.638666 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.639033 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.13901525 +0000 UTC m=+219.312721999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.643193 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.651058 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739298 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.739479 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.239441293 +0000 UTC m=+219.413148042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739717 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739751 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.739828 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.740206 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.240195933 +0000 UTC m=+219.413902862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841345 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.841560 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.34152979 +0000 UTC m=+219.515236539 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841626 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.841713 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842076 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: 
\"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.842143 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.842715 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.342698739 +0000 UTC m=+219.516405578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.843152 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.854340 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 
00:10:04 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:04 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:04 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.854421 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.878171 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"redhat-marketplace-hrh9x\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.893846 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:10:04 crc kubenswrapper[4994]: I0310 00:10:04.943201 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:04 crc kubenswrapper[4994]: E0310 00:10:04.943429 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.443414681 +0000 UTC m=+219.617121430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.000661 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.045630 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.046373 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.046937 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.546917512 +0000 UTC m=+219.720624271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.048573 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.061355 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.149240 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.149414 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.649387348 +0000 UTC m=+219.823094097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.150540 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.150758 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.150858 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.650850685 +0000 UTC m=+219.824557434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.151269 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.151468 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252598 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.252743 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:05.752724796 +0000 UTC m=+219.926431545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252835 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.252980 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: 
\"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.253307 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.75329062 +0000 UTC m=+219.926997369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.253500 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.254232 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.268866 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdkv4\" (UniqueName: 
\"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"redhat-marketplace-bzrd2\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.354177 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.354339 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.854309869 +0000 UTC m=+220.028016628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.356540 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.356850 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.856838803 +0000 UTC m=+220.030545572 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.380890 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.457939 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.458076 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.958058047 +0000 UTC m=+220.131764796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.458937 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.459240 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-10 00:10:05.959230826 +0000 UTC m=+220.132937565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.559941 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.560078 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.06003464 +0000 UTC m=+220.233741389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.560169 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.560576 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.060560824 +0000 UTC m=+220.234267573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.633183 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.634824 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.636812 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.648529 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.661073 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.662044 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.162020954 +0000 UTC m=+220.335727703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763205 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763293 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763347 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.763448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") 
pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.764233 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.264218072 +0000 UTC m=+220.437924911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.854226 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:05 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:05 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:05 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.854298 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.866547 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.870443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.370407551 +0000 UTC m=+220.544114320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871642 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871676 4994 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.871732 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.872150 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.872338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.372304639 +0000 UTC m=+220.546011428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.872387 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.898123 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"redhat-operators-wpd8k\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.953390 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:10:05 crc kubenswrapper[4994]: I0310 00:10:05.972686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:05 crc kubenswrapper[4994]: E0310 00:10:05.973341 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.473266327 +0000 UTC m=+220.646973076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.032654 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"]
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.033713 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.045762 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"]
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.074848 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075105 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075153 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.075185 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.075449 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.575438664 +0000 UTC m=+220.749145413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.117192 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59750: no serving certificate available for the kubelet"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.176322 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.676286649 +0000 UTC m=+220.849993478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176395 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176531 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176701 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176814 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.176832 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.176964 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.676951127 +0000 UTC m=+220.850657886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.177129 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.197647 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"redhat-operators-t5kj4\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.277823 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.278155 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.778098609 +0000 UTC m=+220.951805368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.278270 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.279032 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.779020692 +0000 UTC m=+220.952727441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.355675 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.380208 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.381131 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.881098278 +0000 UTC m=+221.054805067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.481689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.482046 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:06.982034755 +0000 UTC m=+221.155741504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.588752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.589371 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.089348432 +0000 UTC m=+221.263055191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.690092 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.690443 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.190427703 +0000 UTC m=+221.364134452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.737844 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.751842 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.764973 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lxxqb"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.792078 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.792324 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.292302983 +0000 UTC m=+221.466009732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.792360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.792629 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.292622541 +0000 UTC m=+221.466329290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.845372 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.855890 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.856054 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:06 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:06 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:06 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.856081 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898742 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898831 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.898923 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") pod \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\" (UID: \"0f2b43c0-a96d-4ea3-8d46-d6919aedf741\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.899058 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.908241 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume" (OuterVolumeSpecName: "config-volume") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:10:06 crc kubenswrapper[4994]: E0310 00:10:06.908343 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.408328209 +0000 UTC m=+221.582034958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.910184 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb" (OuterVolumeSpecName: "kube-api-access-fx7fb") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "kube-api-access-fx7fb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:10:06 crc kubenswrapper[4994]: I0310 00:10:06.919501 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0f2b43c0-a96d-4ea3-8d46-d6919aedf741" (UID: "0f2b43c0-a96d-4ea3-8d46-d6919aedf741"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000563 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") pod \"c3d610d6-85f4-43b2-a597-4955431daa70\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") "
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000686 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c3d610d6-85f4-43b2-a597-4955431daa70" (UID: "c3d610d6-85f4-43b2-a597-4955431daa70"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.000713 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") pod \"c3d610d6-85f4-43b2-a597-4955431daa70\" (UID: \"c3d610d6-85f4-43b2-a597-4955431daa70\") "
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001230 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3d610d6-85f4-43b2-a597-4955431daa70-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001246 4994 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001258 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.001272 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx7fb\" (UniqueName: \"kubernetes.io/projected/0f2b43c0-a96d-4ea3-8d46-d6919aedf741-kube-api-access-fx7fb\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.001512 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.501500362 +0000 UTC m=+221.675207111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.006053 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c3d610d6-85f4-43b2-a597-4955431daa70" (UID: "c3d610d6-85f4-43b2-a597-4955431daa70"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.101952 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.102160 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.602134061 +0000 UTC m=+221.775840810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.102518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.102582 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3d610d6-85f4-43b2-a597-4955431daa70-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.102963 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.602945981 +0000 UTC m=+221.776652730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.203218 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.203424 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.703373686 +0000 UTC m=+221.877080435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.203464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.203968 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.703957351 +0000 UTC m=+221.877664100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.304996 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.305187 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.805155764 +0000 UTC m=+221.978862513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.305349 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.305790 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.805695518 +0000 UTC m=+221.979402267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334696 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c3d610d6-85f4-43b2-a597-4955431daa70","Type":"ContainerDied","Data":"31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334741 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31dddbc250d328a5341968e6887f3a959acf990551da79bebde9d9af83c538be" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.334809 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.338402 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerStarted","Data":"4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" event={"ID":"0f2b43c0-a96d-4ea3-8d46-d6919aedf741","Type":"ContainerDied","Data":"bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90"} Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340130 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae537ffa0b0b80c06d0b79407ab1e4733786fdf150560f197e315922a963b90" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.340201 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-lqhzx" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.407266 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.407544 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.907508697 +0000 UTC m=+222.081215476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.407745 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.408418 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:07.90840581 +0000 UTC m=+222.082112549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.508466 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.508671 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.00864311 +0000 UTC m=+222.182349889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.508982 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.509370 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.009349947 +0000 UTC m=+222.183056706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.611322 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.611522 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.111493555 +0000 UTC m=+222.285200314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.611654 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.612537 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.11250968 +0000 UTC m=+222.286216469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.712859 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.713205 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.21316497 +0000 UTC m=+222.386871759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.713252 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.713726 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.213709524 +0000 UTC m=+222.387416313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.814625 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.814774 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.314748934 +0000 UTC m=+222.488455683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.815025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.815316 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.315307617 +0000 UTC m=+222.489014366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.864283 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:07 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:07 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:07 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.864356 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.891605 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.893963 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.894013 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.916419 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.916568 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.416540083 +0000 UTC m=+222.590246832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:07 crc kubenswrapper[4994]: I0310 00:10:07.916737 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:07 crc kubenswrapper[4994]: E0310 00:10:07.917065 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.417052516 +0000 UTC m=+222.590759265 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.018960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbrs\" (UniqueName: \"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019282 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019310 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.019329 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") pod \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\" (UID: \"54ca6ee4-24c4-415f-a1b6-26f54e2992f8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 
00:10:08.019356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") pod \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\" (UID: \"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8\") " Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.020374 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config" (OuterVolumeSpecName: "config") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.020450 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.520438094 +0000 UTC m=+222.694144843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.022291 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config" (OuterVolumeSpecName: "config") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.021970 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.024981 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s" (OuterVolumeSpecName: "kube-api-access-rxm5s") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "kube-api-access-rxm5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.025282 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.025575 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.026843 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.027138 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54ca6ee4-24c4-415f-a1b6-26f54e2992f8" (UID: "54ca6ee4-24c4-415f-a1b6-26f54e2992f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.048899 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs" (OuterVolumeSpecName: "kube-api-access-thbrs") pod "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" (UID: "51cdf794-a18c-4a6f-a3ef-a07f03ce95a8"). InnerVolumeSpecName "kube-api-access-thbrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121351 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121506 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121523 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121536 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121547 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxm5s\" (UniqueName: \"kubernetes.io/projected/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-kube-api-access-rxm5s\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121558 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121571 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbrs\" (UniqueName: 
\"kubernetes.io/projected/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-kube-api-access-thbrs\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121582 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121593 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54ca6ee4-24c4-415f-a1b6-26f54e2992f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.121603 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.122054 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.622038178 +0000 UTC m=+222.795744927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.224067 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.235579 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.735548719 +0000 UTC m=+222.909255468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.327397 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.327695 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.827684186 +0000 UTC m=+223.001390935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" event={"ID":"54ca6ee4-24c4-415f-a1b6-26f54e2992f8","Type":"ContainerDied","Data":"68e7ebb7b9a0fa967b84de70c209836439efd368a03ea8c0304dd46c8d9878be"} Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346202 4994 scope.go:117] "RemoveContainer" containerID="0f1267926bcca137db3abcb72a6c709ffad1b8249211d418fa61e8ee79ffda76" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.346323 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353334 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-xm7j6_51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/controller-manager/0.log" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353390 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" event={"ID":"51cdf794-a18c-4a6f-a3ef-a07f03ce95a8","Type":"ContainerDied","Data":"e29a59b07d62557e79f131725545f7bbc14a1ca6dcc0ac4661855d156c889001"} Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.353459 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xm7j6" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.372222 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.385754 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bkhqb"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.395061 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.399145 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xm7j6"] Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.428920 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.429456 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:08.929436785 +0000 UTC m=+223.103143534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.530861 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.531239 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.031216893 +0000 UTC m=+223.204923652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.565281 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" path="/var/lib/kubelet/pods/51cdf794-a18c-4a6f-a3ef-a07f03ce95a8/volumes" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.566004 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" path="/var/lib/kubelet/pods/54ca6ee4-24c4-415f-a1b6-26f54e2992f8/volumes" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.631692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.632057 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.132014857 +0000 UTC m=+223.305721646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.733774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.734113 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.234098612 +0000 UTC m=+223.407805351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.835012 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.835173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.335153232 +0000 UTC m=+223.508860001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.835212 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.835514 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.335506512 +0000 UTC m=+223.509213261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.854546 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:08 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:08 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:08 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.854622 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.936122 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.936272 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:09.436248264 +0000 UTC m=+223.609955013 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:08 crc kubenswrapper[4994]: I0310 00:10:08.936489 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:08 crc kubenswrapper[4994]: E0310 00:10:08.936750 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.436739326 +0000 UTC m=+223.610446075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.042937 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.043238 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.543200182 +0000 UTC m=+223.716906971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.043286 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.043747 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.543734645 +0000 UTC m=+223.717441394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.144219 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.144428 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.644395846 +0000 UTC m=+223.818102595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.144639 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.144972 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.6449585 +0000 UTC m=+223.818665259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.246008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.246282 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.746255476 +0000 UTC m=+223.919962225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.246350 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.246640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.746627725 +0000 UTC m=+223.920334474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.347847 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.348054 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.848028405 +0000 UTC m=+224.021735154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.348111 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.348393 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.848381333 +0000 UTC m=+224.022088072 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.449817 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.450594 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:09.950569001 +0000 UTC m=+224.124275780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.465129 4994 scope.go:117] "RemoveContainer" containerID="ce0e0fe899887c51b11ad46c4b9de66da2a57ba53ab4c60cd515a10d07afe0f7" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.552517 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.553003 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.052987446 +0000 UTC m=+224.226694195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.633708 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634033 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634052 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634064 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634070 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634077 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634083 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.634106 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634112 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634215 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d610d6-85f4-43b2-a597-4955431daa70" containerName="pruner" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634234 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cdf794-a18c-4a6f-a3ef-a07f03ce95a8" containerName="controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634249 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ca6ee4-24c4-415f-a1b6-26f54e2992f8" containerName="route-controller-manager" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634262 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2b43c0-a96d-4ea3-8d46-d6919aedf741" containerName="collect-profiles" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.634678 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.637963 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638216 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638381 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638499 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.638604 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.640112 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.642529 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.643228 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.644215 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.644458 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645167 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645528 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645710 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.645836 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.650033 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.653811 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.654121 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.154106588 +0000 UTC m=+224.327813337 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.660382 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.666436 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.716898 4994 ???:1] "http: TLS handshake error from 192.168.126.11:59752: no serving certificate available for the kubelet" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755736 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755804 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755831 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755887 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755911 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.755966 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " 
pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756132 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756148 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756162 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.756223 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.756564 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.256552832 +0000 UTC m=+224.430259581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860179 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860724 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc 
kubenswrapper[4994]: I0310 00:10:09.860755 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860785 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860865 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860923 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: 
\"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.860974 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861005 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861337 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:09 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:09 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:09 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.861455 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.862357 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.863081 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.363052079 +0000 UTC m=+224.536758828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.863739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.867487 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.873760 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.884440 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.884555 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.895116 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.895613 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"route-controller-manager-6df8f76c79-nrqgg\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " 
pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.905110 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"controller-manager-5c67ff489b-x49rf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:09 crc kubenswrapper[4994]: I0310 00:10:09.962089 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:09 crc kubenswrapper[4994]: E0310 00:10:09.962502 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.462480989 +0000 UTC m=+224.636187738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.011264 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.016244 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.019510 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.040901 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.051080 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.053595 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.063491 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.064020 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.56399942 +0000 UTC m=+224.737706189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.076179 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.164996 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.165342 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.665324947 +0000 UTC m=+224.839031686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.257693 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.264999 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.265513 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.265672 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.76565125 +0000 UTC m=+224.939357999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.265850 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.266142 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.766132691 +0000 UTC m=+224.939839440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.278018 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.281545 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.340783 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a4dc2d_502f_4c05_ab76_1cc708f13006.slice/crio-5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e WatchSource:0}: Error finding container 5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e: Status 404 returned error can't find the container with id 5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.341885 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6cd76f_6272_4fcd_8c75_3040c45ef1b5.slice/crio-7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0 WatchSource:0}: Error finding container 7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0: Status 404 returned error can't find the container with id 7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0 Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.342621 4994 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0429fae4_1356_4d61_86a3_267f74f27636.slice/crio-c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4 WatchSource:0}: Error finding container c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4: Status 404 returned error can't find the container with id c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4 Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.350537 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe30cce_8379_4db8_838b_f48b4bc96621.slice/crio-a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab WatchSource:0}: Error finding container a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab: Status 404 returned error can't find the container with id a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.350799 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2a178b03_e81c_47af_898a_0463f964e327.slice/crio-c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd WatchSource:0}: Error finding container c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd: Status 404 returned error can't find the container with id c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.351757 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76aa065c_ed60_4237_b36f_5ce2865256ff.slice/crio-3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f WatchSource:0}: Error finding container 3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f: Status 404 returned error can't find the container with id 
3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.357016 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdad0261_804d_41dc_8a25_48018f136c0f.slice/crio-97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72 WatchSource:0}: Error finding container 97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72: Status 404 returned error can't find the container with id 97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72 Mar 10 00:10:10 crc kubenswrapper[4994]: W0310 00:10:10.357427 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6525b40b_1c23_4533_a025_4d86bc406f00.slice/crio-45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a WatchSource:0}: Error finding container 45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a: Status 404 returned error can't find the container with id 45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.367962 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.368489 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.868452474 +0000 UTC m=+225.042159223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.377664 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.378950 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerStarted","Data":"c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.380025 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.381115 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerStarted","Data":"3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.382288 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" 
event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerStarted","Data":"5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.383292 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerStarted","Data":"97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.384316 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.385390 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerStarted","Data":"7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.387515 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerStarted","Data":"d1b32d28a2daabcbb6951ddc2404e012b74605f090a8de0ccde979112a9da8a3"} Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.469276 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.469703 4994 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:10.969687878 +0000 UTC m=+225.143394637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.573441 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.573850 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.073833046 +0000 UTC m=+225.247539795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.674603 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.674913 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.174901506 +0000 UTC m=+225.348608255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.776021 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.776132 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.27611558 +0000 UTC m=+225.449822329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.776308 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.776574 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.276566092 +0000 UTC m=+225.450272841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.854949 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:10 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:10 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:10 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.855012 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.877543 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.877796 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:11.377747716 +0000 UTC m=+225.551454475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.878027 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.878481 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.378470214 +0000 UTC m=+225.552176963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.978825 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.979064 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.479022691 +0000 UTC m=+225.652729440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:10 crc kubenswrapper[4994]: I0310 00:10:10.979552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:10 crc kubenswrapper[4994]: E0310 00:10:10.980036 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.480028076 +0000 UTC m=+225.653734825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.080591 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.081108 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.581092976 +0000 UTC m=+225.754799725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.182490 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.182894 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.682862064 +0000 UTC m=+225.856568813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.226275 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.276358 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.284008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.284173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.784145681 +0000 UTC m=+225.957852430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.284222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.284665 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.784653494 +0000 UTC m=+225.958360243 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.385281 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.385656 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.885640722 +0000 UTC m=+226.059347471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.398996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerStarted","Data":"b99811bd76278a20c75ea5a5530b5792fd876cafc8ee3f721f73cedbdb0b24d7"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403703 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-87hn7_9a1c67e3-f6df-4b4d-b3a3-669503580446/cluster-samples-operator/0.log" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403794 4994 generic.go:334] "Generic (PLEG): container finished" podID="9a1c67e3-f6df-4b4d-b3a3-669503580446" containerID="03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144" exitCode=2 Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.403858 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerDied","Data":"03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.404553 4994 scope.go:117] "RemoveContainer" containerID="03ebf9cde09f85323fd76e933846dfc1dfd8ba1c198723c3174424b6bd6a1144" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.407582 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.415677 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerStarted","Data":"8a78cdbee32124e2065f39d9cf54d4202c2bf87b8ec1b372bda9861fe5ee8d02"} Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.486703 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.487296 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:11.987278947 +0000 UTC m=+226.160985756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.588745 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.589027 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.088987473 +0000 UTC m=+226.262694232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.589114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.589415 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.089401964 +0000 UTC m=+226.263108713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.690675 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.690971 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.190946196 +0000 UTC m=+226.364652945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.691184 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.692709 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.19269974 +0000 UTC m=+226.366406489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.792533 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.792807 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.292789696 +0000 UTC m=+226.466496445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.793183 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.793547 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.293539725 +0000 UTC m=+226.467246474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.855041 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:11 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:11 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:11 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.855366 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.894750 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.895342 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:12.395311323 +0000 UTC m=+226.569018082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:11 crc kubenswrapper[4994]: I0310 00:10:11.996098 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:11 crc kubenswrapper[4994]: E0310 00:10:11.996661 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.49664629 +0000 UTC m=+226.670353039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.098196 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.099157 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.599141587 +0000 UTC m=+226.772848336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.184290 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.184766 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.200224 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.200654 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.700643958 +0000 UTC m=+226.874350707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.301212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.301516 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.801465273 +0000 UTC m=+226.975172032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.301593 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.301957 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.801942274 +0000 UTC m=+226.975649023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.402643 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.402907 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.902887952 +0000 UTC m=+227.076594701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.403016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.403321 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:12.903310022 +0000 UTC m=+227.077016771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.426854 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.426990 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.430282 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-87hn7_9a1c67e3-f6df-4b4d-b3a3-669503580446/cluster-samples-operator/0.log" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.430469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-87hn7" event={"ID":"9a1c67e3-f6df-4b4d-b3a3-669503580446","Type":"ContainerStarted","Data":"0a65910acc46447be87b6e67907b65f26084070a5be8a0b720524a4f076a1bbe"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.433481 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerStarted","Data":"d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8"} Mar 10 00:10:12 
crc kubenswrapper[4994]: I0310 00:10:12.435031 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.435100 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.436925 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerStarted","Data":"eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.437640 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.439644 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.439701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.443700 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.444754 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerStarted","Data":"d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.444939 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.458119 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.458246 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.462505 4994 generic.go:334] "Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.462652 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.468385 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.473060 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" 
containerID="ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.473376 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.478645 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.478730 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.489989 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=12.489970212 podStartE2EDuration="12.489970212s" podCreationTimestamp="2026-03-10 00:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:12.486034624 +0000 UTC m=+226.659741373" watchObservedRunningTime="2026-03-10 00:10:12.489970212 +0000 UTC m=+226.663676961" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.490091 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded" exitCode=0 Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.490123 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" 
event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded"} Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499486 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499529 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499537 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.499587 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.507435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 
00:10:12.507640 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:13.007621334 +0000 UTC m=+227.181328083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.507812 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:12 crc kubenswrapper[4994]: E0310 00:10:12.508241 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:13.00822333 +0000 UTC m=+227.181930089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:12 crc kubenswrapper[4994]: I0310 00:10:12.588488 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podStartSLOduration=11.588475928 podStartE2EDuration="11.588475928s" podCreationTimestamp="2026-03-10 00:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:12.585684078 +0000 UTC m=+226.759390827" watchObservedRunningTime="2026-03-10 00:10:12.588475928 +0000 UTC m=+226.762182677" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.872057 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podStartSLOduration=12.872040886 podStartE2EDuration="12.872040886s" podCreationTimestamp="2026-03-10 00:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:13.868932888 +0000 UTC m=+228.042639637" watchObservedRunningTime="2026-03-10 00:10:13.872040886 +0000 UTC m=+228.045747635" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.873855 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:13 crc kubenswrapper[4994]: E0310 00:10:13.874689 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.874668402 +0000 UTC m=+229.048375151 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.893847 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:13 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:13 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:13 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.894005 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:13 crc kubenswrapper[4994]: I0310 00:10:13.977530 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:13 crc kubenswrapper[4994]: E0310 00:10:13.979096 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.479075177 +0000 UTC m=+228.652781926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.086771 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.087128 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.587112451 +0000 UTC m=+228.760819200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.188021 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.188173 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.688149901 +0000 UTC m=+228.861856650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.188217 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.188517 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.68850978 +0000 UTC m=+228.862216519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.289490 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.289669 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.789644013 +0000 UTC m=+228.963350762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.290056 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.290376 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.790369541 +0000 UTC m=+228.964076290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.391589 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.391768 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.891735679 +0000 UTC m=+229.065442428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.392025 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.392406 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.892396645 +0000 UTC m=+229.066103394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.493738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.493954 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.993918017 +0000 UTC m=+229.167624776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.494551 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.494981 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:14.994964754 +0000 UTC m=+229.168671503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.595812 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.596091 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.096063805 +0000 UTC m=+229.269770564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.596310 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.596757 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.096741102 +0000 UTC m=+229.270447851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.697697 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.697926 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.197862554 +0000 UTC m=+229.371569303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.698034 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.698338 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.198326355 +0000 UTC m=+229.372033104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.799680 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.799860 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.299832707 +0000 UTC m=+229.473539456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.799992 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.800304 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.300296769 +0000 UTC m=+229.474003518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.854786 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 00:10:14 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld
Mar 10 00:10:14 crc kubenswrapper[4994]: [+]process-running ok
Mar 10 00:10:14 crc kubenswrapper[4994]: healthz check failed
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.854838 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.900488 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.900682 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:15.400649832 +0000 UTC m=+229.574356581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.900824 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:14 crc kubenswrapper[4994]: E0310 00:10:14.901180 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.401168745 +0000 UTC m=+229.574875494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.903993 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a178b03-e81c-47af-898a-0463f964e327" containerID="d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8" exitCode=0
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.904064 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerDied","Data":"d5e688aeef6f62de6c564f71497e2a41fce184df58655c98c950861c2322f5d8"}
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.905666 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerStarted","Data":"669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407"}
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.907580 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerStarted","Data":"ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9"}
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.910285 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" 
event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"41a63e3cad206deed5794d5cf2ffcfa63552cbdbe5b01fdda5d535ac3a1fe33c"}
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.941671 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" podStartSLOduration=117.689488346 podStartE2EDuration="2m14.941655258s" podCreationTimestamp="2026-03-10 00:08:00 +0000 UTC" firstStartedPulling="2026-03-10 00:09:53.793929549 +0000 UTC m=+207.967636298" lastFinishedPulling="2026-03-10 00:10:11.046096461 +0000 UTC m=+225.219803210" observedRunningTime="2026-03-10 00:10:14.939637218 +0000 UTC m=+229.113343967" watchObservedRunningTime="2026-03-10 00:10:14.941655258 +0000 UTC m=+229.115362007"
Mar 10 00:10:14 crc kubenswrapper[4994]: I0310 00:10:14.953331 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" podStartSLOduration=10.642376352 podStartE2EDuration="14.953312929s" podCreationTimestamp="2026-03-10 00:10:00 +0000 UTC" firstStartedPulling="2026-03-10 00:10:06.810199502 +0000 UTC m=+220.983906251" lastFinishedPulling="2026-03-10 00:10:11.121136079 +0000 UTC m=+225.294842828" observedRunningTime="2026-03-10 00:10:14.95014692 +0000 UTC m=+229.123853659" watchObservedRunningTime="2026-03-10 00:10:14.953312929 +0000 UTC m=+229.127019688"
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.002178 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.002370 4994 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.502341097 +0000 UTC m=+229.676047856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.002599 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.004175 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.504161563 +0000 UTC m=+229.677868312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.104309 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.104488 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.604461534 +0000 UTC m=+229.778168293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.104897 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.105248 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.605236634 +0000 UTC m=+229.778943373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.204446 4994 csr.go:261] certificate signing request csr-tm2rk is approved, waiting to be issued
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.207891 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.208001 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.707976686 +0000 UTC m=+229.881683435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.208100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.208413 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.708405696 +0000 UTC m=+229.882112445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.208700 4994 csr.go:257] certificate signing request csr-tm2rk is issued
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.310016 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.310357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.810342659 +0000 UTC m=+229.984049408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.411889 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.412249 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:15.9122355 +0000 UTC m=+230.085942249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.512517 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.512649 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.012629844 +0000 UTC m=+230.186336593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.513035 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c"
Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.513359 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.013343611 +0000 UTC m=+230.187050360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.614365 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.614558 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.114530775 +0000 UTC m=+230.288237524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.614590 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.614939 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.114931916 +0000 UTC m=+230.288638655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.715255 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.715592 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.215577836 +0000 UTC m=+230.389284585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.727552 4994 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.822505 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.822819 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.3228063 +0000 UTC m=+230.496513049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.859800 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:15 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:15 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:15 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.859858 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.874394 4994 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T00:10:15.727582296Z","Handler":null,"Name":""} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.917938 4994 generic.go:334] "Generic (PLEG): container finished" podID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerID="669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407" exitCode=0 Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.918006 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerDied","Data":"669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.920212 4994 generic.go:334] "Generic (PLEG): container finished" podID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerID="ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9" exitCode=0 Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.920256 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerDied","Data":"ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.924130 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:15 crc kubenswrapper[4994]: E0310 00:10:15.924357 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.424334332 +0000 UTC m=+230.598041121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.932240 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"51f4c87762c30d5a257d71a06dfa2c77c08a659ae1961b259ddd8f58a65eeb3f"} Mar 10 00:10:15 crc kubenswrapper[4994]: I0310 00:10:15.932286 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" event={"ID":"7fd7640d-700a-420e-b15f-7f681090727b","Type":"ContainerStarted","Data":"b86c8f290ec062a91de2823bf6419962c51cff1c96848b8dca59f8631b305c5c"} Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.256003 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 11:23:51.115399071 +0000 UTC Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.256046 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7163h13m34.859355624s for next certificate rotation Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.257357 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:16 crc 
kubenswrapper[4994]: E0310 00:10:16.258244 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:16.758233012 +0000 UTC m=+230.931939761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.953734 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:16 crc kubenswrapper[4994]: [-]has-synced failed: reason withheld Mar 10 00:10:16 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:16 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.953792 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.954411 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:16 crc kubenswrapper[4994]: E0310 00:10:16.979641 4994 goroutinemap.go:150] Operation for "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" failed. No retries permitted until 2026-03-10 00:10:17.479621795 +0000 UTC m=+231.653328544 (durationBeforeRetry 500ms). Error: RegisterPlugin error -- failed to get plugin info using RPC GetInfo at socket /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, err: rpc error: code = DeadlineExceeded desc = context deadline exceeded Mar 10 00:10:16 crc kubenswrapper[4994]: E0310 00:10:16.979746 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.979728757 +0000 UTC m=+232.153435506 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:16 crc kubenswrapper[4994]: I0310 00:10:16.990692 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" podUID="7fd7640d-700a-420e-b15f-7f681090727b" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.45:9898/healthz\": dial tcp 10.217.0.45:9898: connect: connection refused" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.049230 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cf2xx" 
podStartSLOduration=28.049209037 podStartE2EDuration="28.049209037s" podCreationTimestamp="2026-03-10 00:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:17.044800396 +0000 UTC m=+231.218507155" watchObservedRunningTime="2026-03-10 00:10:17.049209037 +0000 UTC m=+231.222915776" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.056135 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.057040 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.557026112 +0000 UTC m=+231.730732861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.160717 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.161029 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.661018086 +0000 UTC m=+231.834724835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.257136 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-01 10:11:04.698311644 +0000 UTC Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.257169 4994 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6394h0m47.4411447s for next certificate rotation Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.262239 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.262475 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.762449986 +0000 UTC m=+231.936156735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.301724 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.363738 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.364030 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.864018829 +0000 UTC m=+232.037725578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.444081 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465260 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465401 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") pod \"2a178b03-e81c-47af-898a-0463f964e327\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465426 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") pod \"2a178b03-e81c-47af-898a-0463f964e327\" (UID: \"2a178b03-e81c-47af-898a-0463f964e327\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.465792 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:17.965759626 +0000 UTC m=+232.139466385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.465928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2a178b03-e81c-47af-898a-0463f964e327" (UID: "2a178b03-e81c-47af-898a-0463f964e327"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.470803 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2a178b03-e81c-47af-898a-0463f964e327" (UID: "2a178b03-e81c-47af-898a-0463f964e327"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.490173 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566335 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") pod \"f04aae5d-b067-4e49-82f3-66412ec1bba6\" (UID: \"f04aae5d-b067-4e49-82f3-66412ec1bba6\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566700 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566755 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a178b03-e81c-47af-898a-0463f964e327-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.566768 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2a178b03-e81c-47af-898a-0463f964e327-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.567038 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.067028551 +0000 UTC m=+232.240735300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.570769 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2" (OuterVolumeSpecName: "kube-api-access-97ch2") pod "f04aae5d-b067-4e49-82f3-66412ec1bba6" (UID: "f04aae5d-b067-4e49-82f3-66412ec1bba6"). InnerVolumeSpecName "kube-api-access-97ch2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668350 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668427 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") pod \"a1456dd8-5038-4bcc-8f19-51325ac84c02\" (UID: \"a1456dd8-5038-4bcc-8f19-51325ac84c02\") " Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.668591 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:18.168558854 +0000 UTC m=+232.342265613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.668814 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.669210 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.16918125 +0000 UTC m=+232.342887999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.669260 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ch2\" (UniqueName: \"kubernetes.io/projected/f04aae5d-b067-4e49-82f3-66412ec1bba6-kube-api-access-97ch2\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.672647 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw" (OuterVolumeSpecName: "kube-api-access-m4fvw") pod "a1456dd8-5038-4bcc-8f19-51325ac84c02" (UID: "a1456dd8-5038-4bcc-8f19-51325ac84c02"). InnerVolumeSpecName "kube-api-access-m4fvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.770684 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.771116 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4fvw\" (UniqueName: \"kubernetes.io/projected/a1456dd8-5038-4bcc-8f19-51325ac84c02-kube-api-access-m4fvw\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.771197 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:10:18.271180013 +0000 UTC m=+232.444886762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.853201 4994 patch_prober.go:28] interesting pod/router-default-5444994796-x6s5d container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:10:17 crc kubenswrapper[4994]: [+]has-synced ok Mar 10 00:10:17 crc kubenswrapper[4994]: [+]process-running ok Mar 10 00:10:17 crc kubenswrapper[4994]: healthz check failed Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.853262 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x6s5d" podUID="f9b1c3de-e5a3-467f-929b-afb8687fb7f0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.872894 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:17 crc kubenswrapper[4994]: E0310 00:10:17.873198 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 00:10:18.373186588 +0000 UTC m=+232.546893337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-75h8c" (UID: "295cba62-fd24-4245-8773-866ee134a29e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.953438 4994 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T00:10:15.727582296Z","Handler":null,"Name":""} Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.958029 4994 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.958065 4994 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.974566 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:10:17 crc kubenswrapper[4994]: I0310 00:10:17.979652 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
(OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.014445 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" event={"ID":"f04aae5d-b067-4e49-82f3-66412ec1bba6","Type":"ContainerDied","Data":"4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.015101 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4311de40639f67ea8a55d45863ea9b8bade3cfa62b21c735dd8d526f7c5e805a" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.015060 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-7rbl8" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023195 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023207 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-9zsf6" event={"ID":"a1456dd8-5038-4bcc-8f19-51325ac84c02","Type":"ContainerDied","Data":"5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.023554 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5834bb42526a131e58f316ca58bdd93f998331f2c74c80c7a568c0b2a5d292c8" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026345 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2a178b03-e81c-47af-898a-0463f964e327","Type":"ContainerDied","Data":"c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd"} Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026364 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9aac967fb18b6328851a9397e77955184a1e3f93d91c3163e98cfc3de1758cd" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.026411 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:10:18 crc kubenswrapper[4994]: E0310 00:10:18.062149 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04aae5d_b067_4e49_82f3_66412ec1bba6.slice\": RecentStats: unable to find data in memory cache]" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.076394 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.452154 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.452228 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.560599 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.606385 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-75h8c\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.625859 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.826635 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:10:18 crc kubenswrapper[4994]: W0310 00:10:18.833394 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod295cba62_fd24_4245_8773_866ee134a29e.slice/crio-dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f WatchSource:0}: Error finding container dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f: Status 404 returned error can't find the container with id dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.892778 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.892823 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.894346 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:10:18 crc kubenswrapper[4994]: I0310 00:10:18.904766 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x6s5d" Mar 10 00:10:19 crc kubenswrapper[4994]: I0310 
00:10:19.033801 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerStarted","Data":"dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f"} Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.069800 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.070177 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.072126 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" containerID="cri-o://d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" gracePeriod=30 Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.072676 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" containerID="cri-o://eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343" gracePeriod=30 Mar 10 00:10:21 crc kubenswrapper[4994]: I0310 00:10:21.082768 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerStarted","Data":"0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba"} Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.354202 4994 patch_prober.go:28] interesting pod/console-f9d7485db-rlqtz container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.354254 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rlqtz" podUID="11b78073-cc4a-4a6f-89ab-631fde4b3371" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.356282 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.407349 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" podStartSLOduration=191.407329733 podStartE2EDuration="3m11.407329733s" podCreationTimestamp="2026-03-10 00:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:22.397395945 +0000 UTC m=+236.571102694" watchObservedRunningTime="2026-03-10 00:10:22.407329733 +0000 UTC m=+236.581036482" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.499853 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500079 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 
00:10:22.500238 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500308 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.500292 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501342 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} pod="openshift-console/downloads-7954f5f757-8lrmb" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501383 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" containerID="cri-o://131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5" gracePeriod=2 Mar 10 00:10:22 crc kubenswrapper[4994]: I0310 00:10:22.501652 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:22 
crc kubenswrapper[4994]: I0310 00:10:22.501707 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.373077 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerID="131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.373254 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerDied","Data":"131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"} Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.376104 4994 generic.go:334] "Generic (PLEG): container finished" podID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerID="d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.376172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerDied","Data":"d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2"} Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.378490 4994 generic.go:334] "Generic (PLEG): container finished" podID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerID="eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4994]: I0310 00:10:24.378523 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" 
event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerDied","Data":"eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343"} Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.201245 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.232484 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.232987 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233004 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233027 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233034 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233046 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233054 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: E0310 00:10:30.233072 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233080 4994 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233300 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233316 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a178b03-e81c-47af-898a-0463f964e327" containerName="pruner" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233330 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" containerName="oc" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233346 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.233909 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.248567 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.386806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387005 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387043 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387075 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") pod \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\" (UID: \"51ce0bbc-ee87-47f6-be5d-24f40386cb60\") " Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387254 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod 
\"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387319 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387346 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387386 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.387967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config" (OuterVolumeSpecName: "config") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.388112 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca" (OuterVolumeSpecName: "client-ca") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.395129 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.401139 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m" (OuterVolumeSpecName: "kube-api-access-6vb2m") pod "51ce0bbc-ee87-47f6-be5d-24f40386cb60" (UID: "51ce0bbc-ee87-47f6-be5d-24f40386cb60"). InnerVolumeSpecName "kube-api-access-6vb2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413042 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" event={"ID":"51ce0bbc-ee87-47f6-be5d-24f40386cb60","Type":"ContainerDied","Data":"b99811bd76278a20c75ea5a5530b5792fd876cafc8ee3f721f73cedbdb0b24d7"} Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413157 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.413312 4994 scope.go:117] "RemoveContainer" containerID="d0af519df7d4889c7f4e2a422bbfa5a0aa335234246b0389b4653db0004e1db2" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.463361 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.469467 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg"] Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.489754 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.489968 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.492377 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc 
kubenswrapper[4994]: I0310 00:10:30.494364 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.492567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.494550 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495243 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495311 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vb2m\" (UniqueName: \"kubernetes.io/projected/51ce0bbc-ee87-47f6-be5d-24f40386cb60-kube-api-access-6vb2m\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495256 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495344 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51ce0bbc-ee87-47f6-be5d-24f40386cb60-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.495512 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ce0bbc-ee87-47f6-be5d-24f40386cb60-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.514322 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"route-controller-manager-6dfb7c6d46-hxxbs\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") " pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.564330 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:10:30 crc kubenswrapper[4994]: I0310 00:10:30.565554 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" path="/var/lib/kubelet/pods/51ce0bbc-ee87-47f6-be5d-24f40386cb60/volumes" Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.012277 4994 patch_prober.go:28] interesting pod/route-controller-manager-6df8f76c79-nrqgg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.012366 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6df8f76c79-nrqgg" podUID="51ce0bbc-ee87-47f6-be5d-24f40386cb60" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.017637 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:10:31 crc kubenswrapper[4994]: I0310 00:10:31.017685 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.190659 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.198825 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rlqtz" Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.287929 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cvds8" Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:32 crc kubenswrapper[4994]: I0310 00:10:32.499350 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.980669 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.981932 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.984062 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.984540 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 00:10:34 crc kubenswrapper[4994]: I0310 00:10:34.989585 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.068517 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.068675 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.169795 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.169862 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.170258 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.200243 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:35 crc kubenswrapper[4994]: I0310 00:10:35.302069 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:10:37 crc kubenswrapper[4994]: I0310 00:10:37.023022 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:10:38 crc kubenswrapper[4994]: I0310 00:10:38.638419 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.774703 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.776711 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.797540 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890521 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890741 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.890878 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992150 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992280 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992369 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992421 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:40 crc kubenswrapper[4994]: I0310 00:10:40.992472 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.017151 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.017221 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.018380 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"installer-9-crc\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.115010 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:41 crc kubenswrapper[4994]: I0310 00:10:41.149341 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:10:42 crc kubenswrapper[4994]: I0310 00:10:42.501102 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:42 crc kubenswrapper[4994]: I0310 00:10:42.501185 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:48 crc kubenswrapper[4994]: I0310 00:10:48.893230 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:10:48 crc 
kubenswrapper[4994]: I0310 00:10:48.893327 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.022430 4994 patch_prober.go:28] interesting pod/controller-manager-5c67ff489b-x49rf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.022948 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:10:51 crc kubenswrapper[4994]: I0310 00:10:51.966862 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011305 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:10:52 crc kubenswrapper[4994]: E0310 00:10:52.011557 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011572 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.011733 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" containerName="controller-manager" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.012242 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.048551 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.135830 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136234 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136478 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: \"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136574 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") pod \"aa106de9-72a4-4364-a10d-2ec2c543afcf\" (UID: 
\"aa106de9-72a4-4364-a10d-2ec2c543afcf\") " Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.136924 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137000 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137008 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137035 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137182 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config" (OuterVolumeSpecName: "config") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137193 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137403 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137554 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-client-ca\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137577 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.137591 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa106de9-72a4-4364-a10d-2ec2c543afcf-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.141216 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf" (OuterVolumeSpecName: "kube-api-access-c7jgf") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "kube-api-access-c7jgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.145473 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa106de9-72a4-4364-a10d-2ec2c543afcf" (UID: "aa106de9-72a4-4364-a10d-2ec2c543afcf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238682 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238736 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238762 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238794 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238847 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod 
\"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238918 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jgf\" (UniqueName: \"kubernetes.io/projected/aa106de9-72a4-4364-a10d-2ec2c543afcf-kube-api-access-c7jgf\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.238933 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa106de9-72a4-4364-a10d-2ec2c543afcf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.242431 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.315021 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.316589 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.317072 
4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.321714 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"controller-manager-7987b4b568-rd8h8\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.358678 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.501122 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.501253 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.593304 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" 
event={"ID":"aa106de9-72a4-4364-a10d-2ec2c543afcf","Type":"ContainerDied","Data":"8a78cdbee32124e2065f39d9cf54d4202c2bf87b8ec1b372bda9861fe5ee8d02"} Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.593770 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c67ff489b-x49rf" Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.639996 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:52 crc kubenswrapper[4994]: I0310 00:10:52.646258 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c67ff489b-x49rf"] Mar 10 00:10:54 crc kubenswrapper[4994]: I0310 00:10:54.561400 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa106de9-72a4-4364-a10d-2ec2c543afcf" path="/var/lib/kubelet/pods/aa106de9-72a4-4364-a10d-2ec2c543afcf/volumes" Mar 10 00:11:00 crc kubenswrapper[4994]: I0310 00:11:00.506791 4994 scope.go:117] "RemoveContainer" containerID="eead1cddbea92663bf5592f78a9fd2d9a4a50429baaf76e2d49ea3e824e4b343" Mar 10 00:11:02 crc kubenswrapper[4994]: I0310 00:11:02.499154 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:02 crc kubenswrapper[4994]: I0310 00:11:02.499784 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.099084 4994 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.099300 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fbrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-s7qcn_openshift-marketplace(abe30cce-8379-4db8-838b-f48b4bc96621): ErrImagePull: rpc error: code = Canceled desc = 
copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:03 crc kubenswrapper[4994]: E0310 00:11:03.100669 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" Mar 10 00:11:04 crc kubenswrapper[4994]: E0310 00:11:04.804909 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.011094 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.011238 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l95st,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bwzk5_openshift-marketplace(fdad0261-804d-41dc-8a25-48018f136c0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.012595 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" Mar 10 00:11:05 crc 
kubenswrapper[4994]: E0310 00:11:05.116730 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.116891 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xfh8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-c4tz9_openshift-marketplace(ab6cd76f-6272-4fcd-8c75-3040c45ef1b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:05 crc kubenswrapper[4994]: E0310 00:11:05.118413 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" Mar 10 00:11:08 crc kubenswrapper[4994]: E0310 00:11:08.946226 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" Mar 10 00:11:08 crc kubenswrapper[4994]: E0310 00:11:08.946316 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.497690 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.498120 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-frwzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t5kj4_openshift-marketplace(0429fae4-1356-4d61-86a3-267f74f27636): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:09 crc kubenswrapper[4994]: E0310 00:11:09.499377 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.532640 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.712933 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.713176 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdkv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bzrd2_openshift-marketplace(64ec1b6f-2c0f-4cfc-be18-a2d311fae68c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:10 crc kubenswrapper[4994]: E0310 00:11:10.714348 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" Mar 10 00:11:11 crc 
kubenswrapper[4994]: E0310 00:11:11.632642 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.633111 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lx2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-zv2kt_openshift-marketplace(76aa065c-ed60-4237-b36f-5ce2865256ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.635034 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.736173 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" Mar 10 00:11:11 crc kubenswrapper[4994]: E0310 00:11:11.737034 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.779087 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.798114 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"] Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.802247 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ffb940_ad05_42fe_99dc_2ca36481a566.slice/crio-0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac WatchSource:0}: Error finding container 0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac: Status 404 returned error can't find the container with id 0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.881061 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:11:11 crc kubenswrapper[4994]: I0310 00:11:11.905958 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.910413 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b7fc03e_d1af_479d_9315_0f25283f3aa1.slice/crio-561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3 WatchSource:0}: Error finding container 561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3: Status 404 returned error can't find the container with id 561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3 Mar 10 00:11:11 crc kubenswrapper[4994]: W0310 00:11:11.928580 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5ec419_c993_43ff_b664_703b8b5a3d5a.slice/crio-83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4 WatchSource:0}: Error finding container 83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4: Status 404 returned error can't find the container with id 83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4 Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.112899 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.115181 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btt9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hrh9x_openshift-marketplace(a4a4dc2d-502f-4c05-ab76-1cc708f13006): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.116563 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.499811 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.500681 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.738996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.740100 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerStarted","Data":"561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.741956 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" 
event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerStarted","Data":"83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.744164 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerStarted","Data":"3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.744199 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerStarted","Data":"0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.745293 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerStarted","Data":"929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"} Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748734 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748804 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" 
start-of-body= Mar 10 00:11:12 crc kubenswrapper[4994]: I0310 00:11:12.748831 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:12 crc kubenswrapper[4994]: E0310 00:11:12.748970 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.753972 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerStarted","Data":"9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.756434 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e" exitCode=0 Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.756498 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.761367 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerStarted","Data":"00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178"} Mar 10 
00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.763962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerStarted","Data":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764044 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764083 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager" containerID="cri-o://3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e" gracePeriod=30 Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764105 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764131 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.764083 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.769944 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.772943 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.798931 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" podStartSLOduration=52.798906695 podStartE2EDuration="52.798906695s" podCreationTimestamp="2026-03-10 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:13.796460493 +0000 UTC m=+287.970167302" watchObservedRunningTime="2026-03-10 00:11:13.798906695 +0000 UTC m=+287.972613444" Mar 10 00:11:13 crc kubenswrapper[4994]: I0310 00:11:13.853205 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" podStartSLOduration=32.853180257 podStartE2EDuration="32.853180257s" podCreationTimestamp="2026-03-10 00:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:13.851519985 +0000 UTC m=+288.025226744" watchObservedRunningTime="2026-03-10 00:11:13.853180257 +0000 UTC m=+288.026887026" Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.774497 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerDied","Data":"3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"} Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.774583 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerID="3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e" exitCode=0
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.778600 4994 generic.go:334] "Generic (PLEG): container finished" podID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerID="00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178" exitCode=0
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.778765 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerDied","Data":"00cdb03a355b97c57210fc1571c46882321cfc03a28cec4f9df0486d60425178"}
Mar 10 00:11:14 crc kubenswrapper[4994]: I0310 00:11:14.842847 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=34.842819648 podStartE2EDuration="34.842819648s" podCreationTimestamp="2026-03-10 00:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:14.83224563 +0000 UTC m=+289.005952419" watchObservedRunningTime="2026-03-10 00:11:14.842819648 +0000 UTC m=+289.016526437"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.071700 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.076131 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105131 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:16 crc kubenswrapper[4994]: E0310 00:11:16.105518 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105597 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: E0310 00:11:16.105672 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105725 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105883 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7fc03e-d1af-479d-9315-0f25283f3aa1" containerName="pruner"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.105954 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" containerName="route-controller-manager"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.106380 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.117985 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.154941 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155042 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155143 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") pod \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") pod \"f9ffb940-ad05-42fe-99dc-2ca36481a566\" (UID: \"f9ffb940-ad05-42fe-99dc-2ca36481a566\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155234 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") pod \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\" (UID: \"3b7fc03e-d1af-479d-9315-0f25283f3aa1\") "
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.155525 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b7fc03e-d1af-479d-9315-0f25283f3aa1" (UID: "3b7fc03e-d1af-479d-9315-0f25283f3aa1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.156351 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca" (OuterVolumeSpecName: "client-ca") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.160225 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config" (OuterVolumeSpecName: "config") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.162647 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz" (OuterVolumeSpecName: "kube-api-access-r94jz") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "kube-api-access-r94jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.164742 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f9ffb940-ad05-42fe-99dc-2ca36481a566" (UID: "f9ffb940-ad05-42fe-99dc-2ca36481a566"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.166501 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b7fc03e-d1af-479d-9315-0f25283f3aa1" (UID: "3b7fc03e-d1af-479d-9315-0f25283f3aa1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256141 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256159 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256185 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256238 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r94jz\" (UniqueName: \"kubernetes.io/projected/f9ffb940-ad05-42fe-99dc-2ca36481a566-kube-api-access-r94jz\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256249 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9ffb940-ad05-42fe-99dc-2ca36481a566-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256258 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256268 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-config\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256277 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b7fc03e-d1af-479d-9315-0f25283f3aa1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.256284 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ffb940-ad05-42fe-99dc-2ca36481a566-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.357940 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358004 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358027 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.358067 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.359918 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.361474 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.363824 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.379484 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"route-controller-manager-5c6cb646fb-vc9sg\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.444288 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b7fc03e-d1af-479d-9315-0f25283f3aa1","Type":"ContainerDied","Data":"561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"}
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794458 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="561b9844b5f034c62f159c5807ee8bec23ed39a360cd1fa632787a672e2ff7e3"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.794540 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798209 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs" event={"ID":"f9ffb940-ad05-42fe-99dc-2ca36481a566","Type":"ContainerDied","Data":"0ec14e3f57f4022c2cac12e0b6ad39a5f8eadbca12ac88bfd6e35f901488d9ac"}
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798273 4994 scope.go:117] "RemoveContainer" containerID="3ffee0ac703d08dbee3e08fe1458ad708a9d5512f365eb71414438145464290e"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.798410 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.833488 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.840740 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dfb7c6d46-hxxbs"]
Mar 10 00:11:16 crc kubenswrapper[4994]: I0310 00:11:16.971317 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"]
Mar 10 00:11:17 crc kubenswrapper[4994]: I0310 00:11:17.808466 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerStarted","Data":"86f6391bf2290461b2367cbde0871fb0815774ff3d6099505b8889c9a6ec884a"}
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.574007 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ffb940-ad05-42fe-99dc-2ca36481a566" path="/var/lib/kubelet/pods/f9ffb940-ad05-42fe-99dc-2ca36481a566/volumes"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.820854 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerStarted","Data":"fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a"}
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892245 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892576 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.892775 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.895228 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 00:11:18 crc kubenswrapper[4994]: I0310 00:11:18.895398 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" gracePeriod=600
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.831357 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" exitCode=0
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.831578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6"}
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.832434 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.844470 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"
Mar 10 00:11:19 crc kubenswrapper[4994]: I0310 00:11:19.859482 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" podStartSLOduration=38.859465542 podStartE2EDuration="38.859465542s" podCreationTimestamp="2026-03-10 00:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:19.855782768 +0000 UTC m=+294.029489557" watchObservedRunningTime="2026-03-10 00:11:19.859465542 +0000 UTC m=+294.033172291"
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.499910 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.499997 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.500394 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:22 crc kubenswrapper[4994]: I0310 00:11:22.500454 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:23 crc kubenswrapper[4994]: I0310 00:11:23.865779 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"}
Mar 10 00:11:24 crc kubenswrapper[4994]: I0310 00:11:24.875822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerStarted","Data":"68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6"}
Mar 10 00:11:24 crc kubenswrapper[4994]: I0310 00:11:24.919526 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wpd8k" podStartSLOduration=9.879826556 podStartE2EDuration="1m19.919505402s" podCreationTimestamp="2026-03-10 00:10:05 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.443648073 +0000 UTC m=+226.617354832" lastFinishedPulling="2026-03-10 00:11:22.483326899 +0000 UTC m=+296.657033678" observedRunningTime="2026-03-10 00:11:24.916090126 +0000 UTC m=+299.089796885" watchObservedRunningTime="2026-03-10 00:11:24.919505402 +0000 UTC m=+299.093212161"
Mar 10 00:11:25 crc kubenswrapper[4994]: I0310 00:11:25.954323 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:25 crc kubenswrapper[4994]: I0310 00:11:25.954412 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:28 crc kubenswrapper[4994]: I0310 00:11:28.828301 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:11:28 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:11:28 crc kubenswrapper[4994]: >
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499300 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499679 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:32 crc kubenswrapper[4994]: I0310 00:11:32.499710 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:36 crc kubenswrapper[4994]: I0310 00:11:36.949973 4994 generic.go:334] "Generic (PLEG): container finished" podID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerID="a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad" exitCode=0
Mar 10 00:11:36 crc kubenswrapper[4994]: I0310 00:11:36.950680 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerDied","Data":"a4cfa8b96c6aa5123624fb879c0f68820a0d96a764fb960d5b7561f433ae5dad"}
Mar 10 00:11:37 crc kubenswrapper[4994]: I0310 00:11:37.010137 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:11:37 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:11:37 crc kubenswrapper[4994]: >
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.738438 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz"
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.828240 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") pod \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") "
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.828313 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") pod \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\" (UID: \"0779a70e-ebf5-4e98-87ea-43017b8d1e46\") "
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.829585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca" (OuterVolumeSpecName: "serviceca") pod "0779a70e-ebf5-4e98-87ea-43017b8d1e46" (UID: "0779a70e-ebf5-4e98-87ea-43017b8d1e46"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.835714 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8" (OuterVolumeSpecName: "kube-api-access-2zmr8") pod "0779a70e-ebf5-4e98-87ea-43017b8d1e46" (UID: "0779a70e-ebf5-4e98-87ea-43017b8d1e46"). InnerVolumeSpecName "kube-api-access-2zmr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.928901 4994 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0779a70e-ebf5-4e98-87ea-43017b8d1e46-serviceca\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.928937 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zmr8\" (UniqueName: \"kubernetes.io/projected/0779a70e-ebf5-4e98-87ea-43017b8d1e46-kube-api-access-2zmr8\") on node \"crc\" DevicePath \"\""
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966556 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-sz8pz" event={"ID":"0779a70e-ebf5-4e98-87ea-43017b8d1e46","Type":"ContainerDied","Data":"b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"}
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966589 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48d5517182b7a9629abeaefea2ce0d25137af9372c04a623cb57c0cb0fada84"
Mar 10 00:11:39 crc kubenswrapper[4994]: I0310 00:11:39.966639 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-sz8pz"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.499674 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500025 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500071 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-8lrmb"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500552 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500604 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"} pod="openshift-console/downloads-7954f5f757-8lrmb" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500634 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" containerID="cri-o://657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb" gracePeriod=2
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.500626 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.501711 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:42 crc kubenswrapper[4994]: I0310 00:11:42.501737 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008378 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerID="657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb" exitCode=0
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008475 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerDied","Data":"657187cd3846e2eb040a978c60bc565d95b249e43315c01390881e18407946eb"}
Mar 10 00:11:44 crc kubenswrapper[4994]: I0310 00:11:44.008714 4994 scope.go:117] "RemoveContainer" containerID="131f6f968be699b4510e1711ff70f7d98fd24e9b749c0ac094982fa64eb070f5"
Mar 10 00:11:46 crc kubenswrapper[4994]: I0310 00:11:46.135748 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:46 crc kubenswrapper[4994]: I0310 00:11:46.186475 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wpd8k"
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.037267 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275"}
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.039399 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8lrmb" event={"ID":"4fb67636-fcba-4975-a460-403cd6ee9c25","Type":"ContainerStarted","Data":"880d3d4f21f0a68a2d906d11b1bffa48bdffb339f9de23fd891a9fb4f152b67d"}
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.039747 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8lrmb"
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.040256 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10 00:11:48 crc kubenswrapper[4994]: I0310 00:11:48.040346 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.056316 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.056438 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.059387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.065813 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerID="6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.065940 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.070071 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.070146 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.096585 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.096660 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.098744 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.098822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.101558 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9" exitCode=0
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102221 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9"}
Mar 10 00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102638 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Mar 10
00:11:49 crc kubenswrapper[4994]: I0310 00:11:49.102663 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.965284 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.965745 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.965772 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.966131 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0779a70e-ebf5-4e98-87ea-43017b8d1e46" containerName="image-pruner" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.966841 4994 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967029 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967466 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967591 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967694 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967562 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04" gracePeriod=15 Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.967663 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5" gracePeriod=15 Mar 10 00:11:50 crc 
kubenswrapper[4994]: I0310 00:11:50.968486 4994 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968729 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968755 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968774 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968788 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968803 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968816 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968829 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968842 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968855 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968867 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968912 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968926 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968946 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968960 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.968980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.968992 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.969013 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969025 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969230 4994 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969251 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969274 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969344 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969371 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969417 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969434 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969452 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:11:50 crc kubenswrapper[4994]: E0310 00:11:50.969639 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 00:11:50 crc kubenswrapper[4994]: I0310 00:11:50.969659 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.017141 4994 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.100692 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.101402 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.102158 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.102534 4994 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.104800 4994 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.104868 4994 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.105540 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106025 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106083 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106107 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106135 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106159 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106214 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106274 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.106324 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc 
kubenswrapper[4994]: I0310 00:11:51.208396 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208505 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208543 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208609 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208621 4994 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208683 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208706 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208760 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208741 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208997 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209050 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208776 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.208737 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209156 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.209316 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.306949 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 10 00:11:51 crc kubenswrapper[4994]: I0310 00:11:51.318519 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:51 crc kubenswrapper[4994]: W0310 00:11:51.359792 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6 WatchSource:0}: Error finding container 18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6: Status 404 returned error can't find the container with id 18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6 Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.367203 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:11:51 crc kubenswrapper[4994]: E0310 00:11:51.708455 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.109499 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:11:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3b
c3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f
3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.
redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4e
dbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.110633 4994 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 
00:11:52.110681 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.110819 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.111116 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.111583 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.112331 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.112357 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.128615 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.128663 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.130451 4994 generic.go:334] "Generic (PLEG): container finished" podID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerID="9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.130529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerDied","Data":"9048ce472e16494b257b9311b0a7e6443400f3bafde71adbbc641d8d3eac2afb"} Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.131281 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.131702 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.133715 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.135724 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136707 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136732 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136744 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c" exitCode=0 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136756 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5" exitCode=2 Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.136794 4994 scope.go:117] "RemoveContainer" containerID="4b2052cc9e93891056215aaf6b92178b89007782d39e2745048c19038a1c08f4" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499612 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499685 4994 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499739 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: I0310 00:11:52.499764 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:11:52 crc kubenswrapper[4994]: E0310 00:11:52.509473 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.437490 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.439110 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552413 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552483 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552520 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552559 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock" (OuterVolumeSpecName: "var-lock") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552613 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") pod \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\" (UID: \"e7e1505e-1226-47f8-8e43-6ac30a4ff867\") " Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552825 4994 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.552841 4994 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7e1505e-1226-47f8-8e43-6ac30a4ff867-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.560125 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e1505e-1226-47f8-8e43-6ac30a4ff867" (UID: "e7e1505e-1226-47f8-8e43-6ac30a4ff867"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:11:53 crc kubenswrapper[4994]: I0310 00:11:53.653571 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e1505e-1226-47f8-8e43-6ac30a4ff867-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.109962 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.158679 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.160524 4994 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b" exitCode=0 Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.162642 4994 generic.go:334] "Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17" exitCode=0 Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.162701 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17"} Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164465 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164643 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164707 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e7e1505e-1226-47f8-8e43-6ac30a4ff867","Type":"ContainerDied","Data":"929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48"} Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.164768 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="929c6f36bcb8cd761948647dfebae867393436e978350669dcfa51d6e35aaf48" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.165113 4994 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.169037 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.170450 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" 
Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.171728 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.192536 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: I0310 00:11:54.192939 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:54 crc kubenswrapper[4994]: E0310 00:11:54.651260 4994 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" volumeName="registry-storage" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.176346 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 
00:11:55.177184 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c46f93fa4d73cf6270f364f11272df6b45c18596808880a68c78370736f59ee" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.198813 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.199506 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200027 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200208 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.200458 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281401 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281556 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281610 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281945 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.281988 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.282009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383215 4994 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383249 4994 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:55 crc kubenswrapper[4994]: I0310 00:11:55.383258 4994 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.185539 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.211850 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.212405 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.212691 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.558780 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.560188 4994 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.560567 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:11:56 crc kubenswrapper[4994]: I0310 00:11:56.586852 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 00:11:57 crc kubenswrapper[4994]: E0310 00:11:57.310863 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="6.4s" Mar 10 00:11:57 crc kubenswrapper[4994]: E0310 
00:11:57.690848 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.207434 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:02Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3bc3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1c
e4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209199 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209567 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.209938 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.210257 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: E0310 00:12:02.210286 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:12:02 
crc kubenswrapper[4994]: I0310 00:12:02.499173 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499255 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499336 4994 patch_prober.go:28] interesting pod/downloads-7954f5f757-8lrmb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 10 00:12:02 crc kubenswrapper[4994]: I0310 00:12:02.499446 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8lrmb" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 10 00:12:03 crc kubenswrapper[4994]: E0310 00:12:03.711782 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.553845 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.555678 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.556521 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.570580 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.570788 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:04 crc kubenswrapper[4994]: E0310 00:12:04.572379 4994 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:04 crc kubenswrapper[4994]: I0310 00:12:04.573331 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:05 crc kubenswrapper[4994]: I0310 00:12:05.966138 4994 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 00:12:05 crc kubenswrapper[4994]: I0310 00:12:05.966461 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.561644 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.562287 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:06 crc kubenswrapper[4994]: I0310 00:12:06.562759 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: 
connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262041 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262793 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262863 4994 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6" exitCode=1 Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.262931 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6"} Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.263651 4994 scope.go:117] "RemoveContainer" containerID="597b2c3115f29618ea3e6b294f965e82d2c20cb7d2696bbaf686f37aacb920d6" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.263710 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.264175 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.264736 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: I0310 00:12:07.265271 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:07 crc kubenswrapper[4994]: E0310 00:12:07.692674 4994 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b526e27adea94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:11:51.36601154 +0000 UTC m=+325.539718299,LastTimestamp:2026-03-10 
00:11:51.36601154 +0000 UTC m=+325.539718299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.330303 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerStarted","Data":"c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6"} Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.333130 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.333638 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.334104 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.334605 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.335055 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:09 crc kubenswrapper[4994]: I0310 00:12:09.799795 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:10 crc kubenswrapper[4994]: E0310 00:12:10.713477 4994 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.293585 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:12:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:2ead96dfa4a455fb7a5f837dda0a6a313f0e14d9f3c87803c59b404dd20bb8c9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4b84875eedbc5bc73b3c7db057dd8a31dc057b3bc3800c363d96061647d5542e\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733466077},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:31f4b29550b003f4be97158173a6848610a5e1f2d75ec55a8f08e1290bea5743\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8ce9cb8700109aeb73f9cdf0faff20b28b7b12065192e35305a606dfe03096f9\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1277641434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:c58b039abb143f3d0ca40a35e68f945fc7e550a9768e6d1e501423c8b084cbe1\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:c87443452256f577ba60dbd9f77a070d264a5f4cd3924036fbc17c0665052272\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221223632},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1c
e4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.294806 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.295314 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.295735 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.296184 4994 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: E0310 00:12:12.296216 4994 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:12:12 
crc kubenswrapper[4994]: W0310 00:12:12.514771 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47 WatchSource:0}: Error finding container 7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47: Status 404 returned error can't find the container with id 7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47 Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.518861 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8lrmb" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.519625 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.520221 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.520650 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.521312 4994 status_manager.go:851] "Failed to get 
status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.523157 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:12 crc kubenswrapper[4994]: I0310 00:12:12.523776 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.169779 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.169842 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.239604 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.240705 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 
38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.241189 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.241812 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.242345 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.242775 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.243263 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.363186 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerStarted","Data":"ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.366024 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerStarted","Data":"0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367253 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367617 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.367974 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.368587 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369142 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369625 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.369891 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerStarted","Data":"06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370830 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 
00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370922 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.370985 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371333 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371606 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.371858 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372148 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372433 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.372776 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.373093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerStarted","Data":"8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.373262 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.397936 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.399335 4994 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.399799 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4775d9148386aa6d7bcab367446c2763501cb2fb0bab1d51b2917349e4a84821"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.403036 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.403995 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7979a6518c99bce5683dee67e99778ebe73e569747a5271d1eb921be7fb9dd47"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 
00:12:13.404764 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.404999 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405261 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405607 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.405974 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.406278 
4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.407323 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerStarted","Data":"4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.409684 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerStarted","Data":"ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4"} Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.410550 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.410824 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.413989 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414221 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414389 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414555 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414728 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.414923 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.459606 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.460326 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.460723 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461345 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461703 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.461988 4994 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462271 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462543 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.462845 4994 status_manager.go:851] "Failed to get status for pod" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:13 crc kubenswrapper[4994]: I0310 00:12:13.785685 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.417549 4994 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d" exitCode=0 Mar 10 00:12:14 crc 
kubenswrapper[4994]: I0310 00:12:14.417592 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3e49e18983f0b96a449876dfb330dd4ce7814743360f0e56be0661919de9264d"} Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.417853 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418287 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418158 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: E0310 00:12:14.418752 4994 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.418762 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419152 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419365 4994 status_manager.go:851] "Failed to get status for pod" podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.419618 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.420180 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.420534 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421054 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421442 4994 status_manager.go:851] "Failed to get status for pod" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" pod="openshift-marketplace/community-operators-zv2kt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zv2kt\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.421707 4994 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422068 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:14 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:14 crc kubenswrapper[4994]: > Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422100 4994 status_manager.go:851] "Failed to get status for pod" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422611 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="abe30cce-8379-4db8-838b-f48b4bc96621" pod="openshift-marketplace/community-operators-s7qcn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-s7qcn\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.422914 4994 status_manager.go:851] "Failed to get status for pod" podUID="0429fae4-1356-4d61-86a3-267f74f27636" pod="openshift-marketplace/redhat-operators-t5kj4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-t5kj4\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423188 4994 status_manager.go:851] "Failed to get status for pod" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" pod="openshift-marketplace/certified-operators-c4tz9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-c4tz9\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423460 4994 status_manager.go:851] "Failed to get status for pod" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" pod="openshift-marketplace/redhat-marketplace-hrh9x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hrh9x\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.423775 4994 status_manager.go:851] "Failed to get status for pod" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" pod="openshift-marketplace/certified-operators-bwzk5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bwzk5\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424123 4994 status_manager.go:851] "Failed to get status for pod" 
podUID="4fb67636-fcba-4975-a460-403cd6ee9c25" pod="openshift-console/downloads-7954f5f757-8lrmb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/downloads-7954f5f757-8lrmb\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424398 4994 status_manager.go:851] "Failed to get status for pod" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" pod="openshift-marketplace/redhat-marketplace-bzrd2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bzrd2\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:14 crc kubenswrapper[4994]: I0310 00:12:14.424725 4994 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.007486 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.007911 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.381359 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:15 crc kubenswrapper[4994]: I0310 00:12:15.381467 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.047167 4994 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:16 crc kubenswrapper[4994]: > Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.356206 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.356322 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.448818 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bzrd2" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:16 crc kubenswrapper[4994]: > Mar 10 00:12:16 crc kubenswrapper[4994]: I0310 00:12:16.454749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5aeba6f8d1fc3d56c8e40a470bacd9595ee0ce5128539e66674f577f8cf699d5"} Mar 10 00:12:17 crc kubenswrapper[4994]: I0310 00:12:17.422354 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" probeResult="failure" output=< Mar 10 00:12:17 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:12:17 crc kubenswrapper[4994]: > Mar 10 00:12:18 crc kubenswrapper[4994]: I0310 00:12:18.470734 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a5c7ed696570aa0f3b3a821017b75c4e650df498798189c2c3a5989dfb0673a1"} Mar 10 00:12:19 crc kubenswrapper[4994]: I0310 00:12:19.799449 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:19 crc kubenswrapper[4994]: I0310 00:12:19.804915 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:20 crc kubenswrapper[4994]: I0310 00:12:20.491055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d8ac9c96642120235e24de181f59530c330fdc1abb013c500614df8f311a6af1"} Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.511504 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7681f4b847501f8824d526a1f4e3d9f91788f9203b77c819a471dc9828d9c67f"} Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.758618 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.758718 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.836472 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:22 crc kubenswrapper[4994]: I0310 00:12:22.999473 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:22 crc kubenswrapper[4994]: 
I0310 00:12:22.999563 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.086727 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.423599 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.492267 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525570 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525613 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.525903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a276672f8df37350ac03bef30c33aaff113a566814c0b2b9e9da731500296641"} Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.527511 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.535774 4994 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.544776 4994 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8b1f5a0-5549-4cd1-8069-1471104b78b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5aeba6f8d1fc3d56c8e40a470bacd9595ee0ce5128539e66674f577f8cf699d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ac9c96642120235e24de181f59530c330fdc1abb013c500614df8f311a6af1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c7ed696570aa0f3b3a821017b75c4e650df498798189c2c3a5989dfb0673a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a276672f8df37350ac03bef30c33aaff113a566814c0b2b9e9da731500296641\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://7681f4b847501f8824d526a1f4e3d9f91788f9203b77c819a471dc9828d9c67f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:12:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.587783 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.595167 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.611774 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:23 crc kubenswrapper[4994]: I0310 00:12:23.793357 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.542850 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.543223 4994 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.573454 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.573521 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:24 crc kubenswrapper[4994]: I0310 00:12:24.580787 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.075083 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.147772 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.450503 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.525497 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.551411 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.551455 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.562528 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:25 crc kubenswrapper[4994]: I0310 00:12:25.567388 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.422244 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.490043 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.567099 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:26 crc kubenswrapper[4994]: I0310 00:12:26.567133 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:27 crc kubenswrapper[4994]: I0310 00:12:27.568340 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:27 crc kubenswrapper[4994]: I0310 00:12:27.568381 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 00:12:34.583972 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 00:12:34.585398 4994 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:34 crc kubenswrapper[4994]: I0310 
00:12:34.585432 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e8b1f5a0-5549-4cd1-8069-1471104b78b5" Mar 10 00:12:36 crc kubenswrapper[4994]: I0310 00:12:36.600315 4994 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e132a8ed-7c07-4ab3-8467-4370b175763c" Mar 10 00:12:50 crc kubenswrapper[4994]: I0310 00:12:50.523901 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 00:12:51 crc kubenswrapper[4994]: I0310 00:12:51.243363 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.230657 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.269775 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:12:52 crc kubenswrapper[4994]: I0310 00:12:52.611463 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:12:53 crc kubenswrapper[4994]: I0310 00:12:53.824900 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 00:12:53 crc kubenswrapper[4994]: I0310 00:12:53.922173 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.066629 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 
00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.647824 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.758540 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 00:12:54 crc kubenswrapper[4994]: I0310 00:12:54.839584 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.122730 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.529442 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.581338 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 00:12:55 crc kubenswrapper[4994]: I0310 00:12:55.654593 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.024750 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.525347 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.633186 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.698488 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.785272 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 00:12:56 crc kubenswrapper[4994]: I0310 00:12:56.820447 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.138482 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.231972 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.337524 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.372518 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.383534 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.688425 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.700832 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.800928 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 
00:12:57.838382 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.838719 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 00:12:57 crc kubenswrapper[4994]: I0310 00:12:57.913495 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.031531 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.311062 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.312369 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.383008 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.770630 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.835420 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.840919 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 00:12:58 crc kubenswrapper[4994]: I0310 00:12:58.920043 4994 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.279974 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.448837 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.607166 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.779028 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:12:59 crc kubenswrapper[4994]: I0310 00:12:59.825060 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.073992 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.139740 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd" exitCode=0 Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.139809 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"} Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.140517 4994 scope.go:117] "RemoveContainer" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd" Mar 10 
00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.218072 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.370050 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.461102 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.521280 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.559166 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.649999 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.756778 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 00:13:00 crc kubenswrapper[4994]: I0310 00:13:00.844505 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.012727 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.088494 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156169 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156895 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd" exitCode=1
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.156942 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"}
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.157234 4994 scope.go:117] "RemoveContainer" containerID="8746dfac88fe15ceee6b052e166cd28bfb74ffbe54cfe6f00bf09d8a12e889fd"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.157978 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:01 crc kubenswrapper[4994]: E0310 00:13:01.158471 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.186199 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.222995 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.224746 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.259190 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.292403 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.517622 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.536348 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.926314 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.947741 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 10 00:13:01 crc kubenswrapper[4994]: I0310 00:13:01.966231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.061025 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.061095 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.169371 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.170103 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:02 crc kubenswrapper[4994]: E0310 00:13:02.170408 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.283312 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.322807 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.483220 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.537458 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.817565 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 00:13:02 crc kubenswrapper[4994]: I0310 00:13:02.974867 4994 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.122193 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.129992 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.173700 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.469023 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.495866 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.621363 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.677069 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.758705 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.811183 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.868307 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.902269 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.927663 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 00:13:03 crc kubenswrapper[4994]: I0310 00:13:03.957199 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.076527 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.101746 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.147917 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.165047 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.259652 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.265560 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.401834 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.447342 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.472362 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.485072 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.555656 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.595488 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.743915 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 10 00:13:04 crc kubenswrapper[4994]: I0310 00:13:04.921386 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.425468 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.440019 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.803944 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.836784 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 00:13:05 crc kubenswrapper[4994]: I0310 00:13:05.890751 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.061064 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.158475 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.219922 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.263203 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.289397 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.335867 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.380653 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.546335 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.569657 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.611205 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.765408 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.793951 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 10 00:13:06 crc kubenswrapper[4994]: I0310 00:13:06.916116 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.151931 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.453656 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.485207 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.528042 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.567856 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.622600 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.669567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.691711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 10 00:13:07 crc kubenswrapper[4994]: I0310 00:13:07.886385 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.138511 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.155823 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.209592 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.377219 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.489571 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.604903 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.692032 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 10 00:13:08 crc kubenswrapper[4994]: I0310 00:13:08.867049 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.049977 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.116748 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.188307 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.257194 4994 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.321531 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.332985 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.383306 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.554935 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.612648 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.622702 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.749565 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.773971 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 00:13:09 crc kubenswrapper[4994]: I0310 00:13:09.818133 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.029074 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.074567 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.125502 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.197005 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.220300 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.224533 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.263509 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.324983 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.325274 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.453619 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.466249 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.473405 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.585654 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.891192 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 00:13:10 crc kubenswrapper[4994]: I0310 00:13:10.970558 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.198320 4994 scope.go:117] "RemoveContainer" containerID="ba768d801b1af483294cc4c83db2f2fed09e29bf2538636a9ef1b5b919e82e04"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.222530 4994 scope.go:117] "RemoveContainer" containerID="5d8d9d895935ea3a9a93164c807fbe78cb08b23d1197887ef1bc448c133ab25b"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.246985 4994 scope.go:117] "RemoveContainer" containerID="71ae18aa606f50762bbd16146190406e6cc5901c2a4c6966e159543a1034e54c"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.267330 4994 scope.go:117] "RemoveContainer" containerID="ca5e9d7de2d25818126326db76b1bede4a34f8da38d19efa98cccc184f8b98b5"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.289004 4994 scope.go:117] "RemoveContainer" containerID="b28b834fabb0fc5d1f7dfc8ea8a298c8c1d3868046105304eb5d3cc158bdbd80"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.301552 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.389975 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.507719 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.702245 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.716352 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.723186 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.793656 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.836468 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 00:13:11 crc kubenswrapper[4994]: I0310 00:13:11.941234 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.014231 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.194212 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.273398 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.276241 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.364372 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.452480 4994 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.478328 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.790141 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.816516 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.816586 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 00:13:12 crc kubenswrapper[4994]: I0310 00:13:12.926829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.045229 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.068744 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.307711 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.373497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.392038 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.396360 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 00:13:13 crc kubenswrapper[4994]: I0310 00:13:13.693933 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.034949 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.059345 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.379421 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.407784 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.554290 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.585606 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.661101 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.727226 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.727293 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.736117 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.892685 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 10 00:13:14 crc kubenswrapper[4994]: I0310 00:13:14.928035 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.011361 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.155123 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.270418 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271224 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/1.log"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271289 4994 generic.go:334] "Generic (PLEG): container finished" podID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" exitCode=1
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85"}
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.271383 4994 scope.go:117] "RemoveContainer" containerID="7bce6eb584bb3c8177c72c93b8bc60cfd8583faf52e9d74e3b32955e8a8583bd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.273410 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85"
Mar 10 00:13:15 crc kubenswrapper[4994]: E0310 00:13:15.274419 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.404816 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.482335 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.489241 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.540134 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.735362 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.778612 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 10 00:13:15 crc kubenswrapper[4994]: I0310 00:13:15.871540 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.281611 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.347811 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.392970 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.531987 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.627511 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.660250 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 10 00:13:16 crc kubenswrapper[4994]: I0310 00:13:16.748926 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.102178 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.160809 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.195214 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.399830 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.525559 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.528889 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.690603 4994 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.691388 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c4tz9" podStartSLOduration=87.662213091 podStartE2EDuration="3m15.691362194s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.428679297 +0000 UTC m=+226.602386046" lastFinishedPulling="2026-03-10 00:12:00.45782837 +0000 UTC m=+334.631535149" observedRunningTime="2026-03-10 00:12:23.201989372 +0000 UTC m=+357.375696131" watchObservedRunningTime="2026-03-10 00:13:17.691362194 +0000 UTC m=+411.865068983"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.691557 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bzrd2" podStartSLOduration=73.967708237 podStartE2EDuration="3m12.691549919s" podCreationTimestamp="2026-03-10 00:10:05 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.461758126 +0000 UTC m=+226.635464875" lastFinishedPulling="2026-03-10 00:12:11.185599768 +0000 UTC m=+345.359306557" observedRunningTime="2026-03-10 00:12:23.257524618 +0000 UTC m=+357.431231377" watchObservedRunningTime="2026-03-10 00:13:17.691549919 +0000 UTC m=+411.865256698"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.693788 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7qcn" podStartSLOduration=86.716270785 podStartE2EDuration="3m14.693775054s" podCreationTimestamp="2026-03-10 00:10:03 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.479856419 +0000 UTC m=+226.653563168" lastFinishedPulling="2026-03-10 00:12:00.457360648 +0000 UTC m=+334.631067437" observedRunningTime="2026-03-10 00:12:23.12237734 +0000 UTC m=+357.296084159" watchObservedRunningTime="2026-03-10 00:13:17.693775054 +0000 UTC m=+411.867481843"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.694076 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zv2kt" podStartSLOduration=84.258518061 podStartE2EDuration="3m15.694067971s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.477159162 +0000 UTC m=+226.650865911" lastFinishedPulling="2026-03-10 00:12:03.912709072 +0000 UTC m=+338.086415821" observedRunningTime="2026-03-10 00:12:23.290094353 +0000 UTC m=+357.463801102" watchObservedRunningTime="2026-03-10 00:13:17.694067971 +0000 UTC m=+411.867774770"
Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.696137 4994 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-t5kj4" podStartSLOduration=71.667067255 podStartE2EDuration="3m11.696126513s" podCreationTimestamp="2026-03-10 00:10:06 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.468096984 +0000 UTC m=+226.641803743" lastFinishedPulling="2026-03-10 00:12:12.497156242 +0000 UTC m=+346.670863001" observedRunningTime="2026-03-10 00:12:23.144680031 +0000 UTC m=+357.318386780" watchObservedRunningTime="2026-03-10 00:13:17.696126513 +0000 UTC m=+411.869833292" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.696410 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwzk5" podStartSLOduration=77.00224967 podStartE2EDuration="3m15.69640182s" podCreationTimestamp="2026-03-10 00:10:02 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.492788593 +0000 UTC m=+226.666495342" lastFinishedPulling="2026-03-10 00:12:11.186940703 +0000 UTC m=+345.360647492" observedRunningTime="2026-03-10 00:12:23.170935795 +0000 UTC m=+357.344642554" watchObservedRunningTime="2026-03-10 00:13:17.69640182 +0000 UTC m=+411.870108609" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.697143 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrh9x" podStartSLOduration=83.340264518 podStartE2EDuration="3m13.697134298s" podCreationTimestamp="2026-03-10 00:10:04 +0000 UTC" firstStartedPulling="2026-03-10 00:10:12.436264708 +0000 UTC m=+226.609971447" lastFinishedPulling="2026-03-10 00:12:02.793134428 +0000 UTC m=+336.966841227" observedRunningTime="2026-03-10 00:12:23.230209937 +0000 UTC m=+357.403916696" watchObservedRunningTime="2026-03-10 00:13:17.697134298 +0000 UTC m=+411.870841077" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.699962 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 
00:13:17.700038 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:13:17 crc kubenswrapper[4994]: E0310 00:13:17.700338 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.700357 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.700546 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e1505e-1226-47f8-8e43-6ac30a4ff867" containerName="installer" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.701218 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.703915 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.704341 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.704637 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.709747 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.725569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: 
\"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.730673 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=54.73064553 podStartE2EDuration="54.73064553s" podCreationTimestamp="2026-03-10 00:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:17.727479461 +0000 UTC m=+411.901186240" watchObservedRunningTime="2026-03-10 00:13:17.73064553 +0000 UTC m=+411.904352279" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.748490 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=22.748465932 podStartE2EDuration="22.748465932s" podCreationTimestamp="2026-03-10 00:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:17.74840647 +0000 UTC m=+411.922113229" watchObservedRunningTime="2026-03-10 00:13:17.748465932 +0000 UTC m=+411.922172701" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.829552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:17 crc kubenswrapper[4994]: I0310 00:13:17.859986 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnm5w\" (UniqueName: 
\"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"auto-csr-approver-29551692-29hls\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.024678 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.064103 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.211140 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.244414 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 00:13:18 crc kubenswrapper[4994]: I0310 00:13:18.456472 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:13:18 crc kubenswrapper[4994]: W0310 00:13:18.468152 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a70a0f_0e78_4f55_9eee_62099acf734d.slice/crio-87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b WatchSource:0}: Error finding container 87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b: Status 404 returned error can't find the container with id 87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.260147 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.303154 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerStarted","Data":"87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b"} Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.335761 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.660932 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.759778 4994 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.760169 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" gracePeriod=5 Mar 10 00:13:19 crc kubenswrapper[4994]: I0310 00:13:19.839637 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.082434 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.269450 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.313926 4994 generic.go:334] "Generic (PLEG): container finished" podID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerID="ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec" exitCode=0 Mar 10 00:13:20 
crc kubenswrapper[4994]: I0310 00:13:20.314020 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerDied","Data":"ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec"} Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.401099 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 00:13:20 crc kubenswrapper[4994]: I0310 00:13:20.854682 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.025544 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.077348 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.293460 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.425428 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.478182 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.513489 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.617055 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.638429 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.684750 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") pod \"24a70a0f-0e78-4f55-9eee-62099acf734d\" (UID: \"24a70a0f-0e78-4f55-9eee-62099acf734d\") " Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.695256 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w" (OuterVolumeSpecName: "kube-api-access-dnm5w") pod "24a70a0f-0e78-4f55-9eee-62099acf734d" (UID: "24a70a0f-0e78-4f55-9eee-62099acf734d"). InnerVolumeSpecName "kube-api-access-dnm5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.778967 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:13:21 crc kubenswrapper[4994]: I0310 00:13:21.792250 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnm5w\" (UniqueName: \"kubernetes.io/projected/24a70a0f-0e78-4f55-9eee-62099acf734d-kube-api-access-dnm5w\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060248 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060314 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.060909 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:13:22 crc kubenswrapper[4994]: E0310 00:13:22.061409 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.295014 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333158 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-29hls" 
event={"ID":"24a70a0f-0e78-4f55-9eee-62099acf734d","Type":"ContainerDied","Data":"87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b"} Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333467 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f833a4ec9d30c9292ef3b7519a0b401b021a95f8b7dbbbbcf037633a46166b" Mar 10 00:13:22 crc kubenswrapper[4994]: I0310 00:13:22.333230 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-29hls" Mar 10 00:13:23 crc kubenswrapper[4994]: I0310 00:13:23.744315 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 00:13:23 crc kubenswrapper[4994]: I0310 00:13:23.828161 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:13:24 crc kubenswrapper[4994]: I0310 00:13:24.065853 4994 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.018546 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.306216 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359828 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359929 4994 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" 
exitCode=137 Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.359980 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18a5e48ee962680b8c06a3e5a5396e0f805c9148a02be8cc74fe52d50407bdb6" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.378829 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.378928 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458639 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458701 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458784 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458844 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458904 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.458864 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459146 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459253 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459467 4994 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459489 4994 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459508 4994 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.459523 4994 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.470072 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:13:25 crc kubenswrapper[4994]: I0310 00:13:25.561000 4994 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.151134 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.367061 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.569651 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.570169 4994 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.585805 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.585854 4994 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7d7465c-7086-43ed-96f0-3fed2dc918a1" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.593055 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.593110 4994 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="a7d7465c-7086-43ed-96f0-3fed2dc918a1" Mar 10 00:13:26 crc kubenswrapper[4994]: I0310 00:13:26.641694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 00:13:27 crc kubenswrapper[4994]: I0310 00:13:27.380805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:13:27 crc kubenswrapper[4994]: I0310 00:13:27.556845 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 00:13:33 crc kubenswrapper[4994]: I0310 00:13:33.555034 4994 scope.go:117] "RemoveContainer" containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:13:33 crc kubenswrapper[4994]: E0310 00:13:33.556345 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-tgf68_openshift-marketplace(b85bbdaa-daa8-4c69-abf9-9f1200eb07cd)\"" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.008496 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.009388 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwzk5" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" containerID="cri-o://8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.025589 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:41 
crc kubenswrapper[4994]: I0310 00:13:41.025913 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c4tz9" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" containerID="cri-o://c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.034457 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.034727 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s7qcn" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" containerID="cri-o://06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.040179 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.040387 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zv2kt" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" containerID="cri-o://ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.048345 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.058041 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.058677 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bzrd2" 
podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" containerID="cri-o://4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.069586 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.069970 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrh9x" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" containerID="cri-o://0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.076577 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.093964 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.094241 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094255 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.094278 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094288 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094420 4994 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094440 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" containerName="oc" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.094996 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.103729 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.104009 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5kj4" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" containerID="cri-o://ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.110590 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.110831 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wpd8k" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" containerID="cri-o://68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.146864 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.166286 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.166533 4994 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" containerID="cri-o://fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: E0310 00:13:41.179499 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a4dc2d_502f_4c05_ab76_1cc708f13006.slice/crio-0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129.scope\": RecentStats: unable to find data in memory cache]" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202340 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202380 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.202434 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303012 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303060 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.303115 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.304799 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.315725 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.317792 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcft9\" (UniqueName: \"kubernetes.io/projected/46c4619e-ab9f-4fd9-9f3e-5b7ba9415823-kube-api-access-vcft9\") pod \"marketplace-operator-79b997595-ppfwk\" (UID: \"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823\") " pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.484461 4994 generic.go:334] "Generic (PLEG): container finished" podID="6525b40b-1c23-4533-a025-4d86bc406f00" containerID="68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.484823 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.502787 4994 generic.go:334] "Generic (PLEG): container finished" podID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerID="c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.502828 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.508743 4994 generic.go:334] 
"Generic (PLEG): container finished" podID="0429fae4-1356-4d61-86a3-267f74f27636" containerID="ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.508794 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.514518 4994 generic.go:334] "Generic (PLEG): container finished" podID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerID="0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.514566 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.517290 4994 generic.go:334] "Generic (PLEG): container finished" podID="fdad0261-804d-41dc-8a25-48018f136c0f" containerID="8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.517385 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520080 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520190 4994 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" event={"ID":"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd","Type":"ContainerDied","Data":"18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.520320 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18370ebb096ba95df384c4822d63f9eeb86d553280650a21652dc981b373eee5" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.524824 4994 generic.go:334] "Generic (PLEG): container finished" podID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerID="ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.524886 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.529278 4994 generic.go:334] "Generic (PLEG): container finished" podID="3653335d-178c-4df8-a93d-4d19011298fe" containerID="fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.529323 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerDied","Data":"fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.533882 4994 generic.go:334] "Generic (PLEG): container finished" podID="abe30cce-8379-4db8-838b-f48b4bc96621" containerID="06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.533926 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536336 4994 generic.go:334] "Generic (PLEG): container finished" podID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerID="4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" exitCode=0 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536363 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf"} Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.536501 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" containerID="cri-o://436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" gracePeriod=30 Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.727253 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.730796 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tgf68_b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/marketplace-operator/2.log" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.730862 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.736169 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.741335 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.750965 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.758036 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.763666 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.771700 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.780804 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.785570 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.792348 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.808945 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809076 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809431 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809466 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") pod \"fdad0261-804d-41dc-8a25-48018f136c0f\" (UID: \"fdad0261-804d-41dc-8a25-48018f136c0f\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809491 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809738 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809759 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809778 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: 
\"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") pod \"abe30cce-8379-4db8-838b-f48b4bc96621\" (UID: \"abe30cce-8379-4db8-838b-f48b4bc96621\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809827 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") pod \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\" (UID: \"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809846 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") pod \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\" (UID: \"b85bbdaa-daa8-4c69-abf9-9f1200eb07cd\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.809910 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") pod \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\" (UID: \"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.811477 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities" (OuterVolumeSpecName: "utilities") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.812422 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities" (OuterVolumeSpecName: "utilities") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.813559 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities" (OuterVolumeSpecName: "utilities") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.815585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4" (OuterVolumeSpecName: "kube-api-access-fdkv4") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "kube-api-access-fdkv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.819387 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.821360 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s" (OuterVolumeSpecName: "kube-api-access-xfh8s") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "kube-api-access-xfh8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.826427 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.829332 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l" (OuterVolumeSpecName: "kube-api-access-7278l") pod "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" (UID: "b85bbdaa-daa8-4c69-abf9-9f1200eb07cd"). InnerVolumeSpecName "kube-api-access-7278l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.840088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st" (OuterVolumeSpecName: "kube-api-access-l95st") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "kube-api-access-l95st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.858053 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm" (OuterVolumeSpecName: "kube-api-access-2fbrm") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "kube-api-access-2fbrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.859099 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities" (OuterVolumeSpecName: "utilities") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.859154 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" (UID: "64ec1b6f-2c0f-4cfc-be18-a2d311fae68c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.875312 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abe30cce-8379-4db8-838b-f48b4bc96621" (UID: "abe30cce-8379-4db8-838b-f48b4bc96621"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.879217 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912553 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912604 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912645 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912677 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912707 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912751 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912771 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912805 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912835 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912860 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") pod \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\" (UID: \"a4a4dc2d-502f-4c05-ab76-1cc708f13006\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912898 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 
crc kubenswrapper[4994]: I0310 00:13:41.912922 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") pod \"0429fae4-1356-4d61-86a3-267f74f27636\" (UID: \"0429fae4-1356-4d61-86a3-267f74f27636\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912946 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") pod \"3653335d-178c-4df8-a93d-4d19011298fe\" (UID: \"3653335d-178c-4df8-a93d-4d19011298fe\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.912971 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.913010 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") pod \"76aa065c-ed60-4237-b36f-5ce2865256ff\" (UID: \"76aa065c-ed60-4237-b36f-5ce2865256ff\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.914584 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities" (OuterVolumeSpecName: "utilities") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.914840 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities" (OuterVolumeSpecName: "utilities") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.915342 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities" (OuterVolumeSpecName: "utilities") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.915518 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") pod \"6525b40b-1c23-4533-a025-4d86bc406f00\" (UID: \"6525b40b-1c23-4533-a025-4d86bc406f00\") " Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.916527 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities" (OuterVolumeSpecName: "utilities") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918627 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918646 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfh8s\" (UniqueName: \"kubernetes.io/projected/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-kube-api-access-xfh8s\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918664 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fbrm\" (UniqueName: \"kubernetes.io/projected/abe30cce-8379-4db8-838b-f48b4bc96621-kube-api-access-2fbrm\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918675 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918686 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918697 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918707 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l95st\" (UniqueName: \"kubernetes.io/projected/fdad0261-804d-41dc-8a25-48018f136c0f-kube-api-access-l95st\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc 
kubenswrapper[4994]: I0310 00:13:41.918717 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918726 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918738 4994 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918747 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918756 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918768 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918780 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe30cce-8379-4db8-838b-f48b4bc96621-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918855 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918866 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdkv4\" (UniqueName: \"kubernetes.io/projected/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c-kube-api-access-fdkv4\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.918949 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7278l\" (UniqueName: \"kubernetes.io/projected/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd-kube-api-access-7278l\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.922797 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt" (OuterVolumeSpecName: "kube-api-access-frwzt") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "kube-api-access-frwzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.925669 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config" (OuterVolumeSpecName: "config") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.925935 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c" (OuterVolumeSpecName: "kube-api-access-btt9c") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "kube-api-access-btt9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.926267 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca" (OuterVolumeSpecName: "client-ca") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.927987 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr" (OuterVolumeSpecName: "kube-api-access-f8zbr") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "kube-api-access-f8zbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.929265 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h" (OuterVolumeSpecName: "kube-api-access-xpl2h") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "kube-api-access-xpl2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.937986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s" (OuterVolumeSpecName: "kube-api-access-5lx2s") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "kube-api-access-5lx2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.938426 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3653335d-178c-4df8-a93d-4d19011298fe" (UID: "3653335d-178c-4df8-a93d-4d19011298fe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.969535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" (UID: "ab6cd76f-6272-4fcd-8c75-3040c45ef1b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.971131 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a4dc2d-502f-4c05-ab76-1cc708f13006" (UID: "a4a4dc2d-502f-4c05-ab76-1cc708f13006"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.983849 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76aa065c-ed60-4237-b36f-5ce2865256ff" (UID: "76aa065c-ed60-4237-b36f-5ce2865256ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:41 crc kubenswrapper[4994]: I0310 00:13:41.985633 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdad0261-804d-41dc-8a25-48018f136c0f" (UID: "fdad0261-804d-41dc-8a25-48018f136c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019359 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019444 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019476 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019495 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019542 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") pod \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\" (UID: \"ae5ec419-c993-43ff-b664-703b8b5a3d5a\") " Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019751 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwzt\" (UniqueName: \"kubernetes.io/projected/0429fae4-1356-4d61-86a3-267f74f27636-kube-api-access-frwzt\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019764 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019774 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019783 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4dc2d-502f-4c05-ab76-1cc708f13006-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019791 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3653335d-178c-4df8-a93d-4d19011298fe-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019799 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btt9c\" (UniqueName: \"kubernetes.io/projected/a4a4dc2d-502f-4c05-ab76-1cc708f13006-kube-api-access-btt9c\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019807 4994 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-f8zbr\" (UniqueName: \"kubernetes.io/projected/3653335d-178c-4df8-a93d-4d19011298fe-kube-api-access-f8zbr\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019815 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3653335d-178c-4df8-a93d-4d19011298fe-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019823 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76aa065c-ed60-4237-b36f-5ce2865256ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019831 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdad0261-804d-41dc-8a25-48018f136c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019839 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lx2s\" (UniqueName: \"kubernetes.io/projected/76aa065c-ed60-4237-b36f-5ce2865256ff-kube-api-access-5lx2s\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.019848 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpl2h\" (UniqueName: \"kubernetes.io/projected/6525b40b-1c23-4533-a025-4d86bc406f00-kube-api-access-xpl2h\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020212 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020292 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config" (OuterVolumeSpecName: "config") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.020524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.023592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.025069 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft" (OuterVolumeSpecName: "kube-api-access-p9lft") pod "ae5ec419-c993-43ff-b664-703b8b5a3d5a" (UID: "ae5ec419-c993-43ff-b664-703b8b5a3d5a"). InnerVolumeSpecName "kube-api-access-p9lft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.069285 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0429fae4-1356-4d61-86a3-267f74f27636" (UID: "0429fae4-1356-4d61-86a3-267f74f27636"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.074323 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6525b40b-1c23-4533-a025-4d86bc406f00" (UID: "6525b40b-1c23-4533-a025-4d86bc406f00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120491 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120522 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae5ec419-c993-43ff-b664-703b8b5a3d5a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120530 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120541 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0429fae4-1356-4d61-86a3-267f74f27636-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120551 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9lft\" (UniqueName: \"kubernetes.io/projected/ae5ec419-c993-43ff-b664-703b8b5a3d5a-kube-api-access-p9lft\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120561 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae5ec419-c993-43ff-b664-703b8b5a3d5a-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.120569 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6525b40b-1c23-4533-a025-4d86bc406f00-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.145328 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ppfwk"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.528959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529697 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529719 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529730 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529740 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 
00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529753 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529761 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529771 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529782 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529794 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529802 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529812 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529819 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529832 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529839 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" 
containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529849 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529857 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529896 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529907 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529922 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529931 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529940 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529948 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529959 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529968 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" 
containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.529990 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.529999 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530007 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530017 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530024 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530041 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530049 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530059 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530067 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" 
containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530078 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530086 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530123 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530131 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530143 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530152 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530164 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530173 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530188 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530198 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" 
containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530211 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530222 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530235 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530242 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="extract-content" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530253 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530261 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530270 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530277 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530290 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530297 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" 
containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530307 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530314 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.530325 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530332 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="extract-utilities" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530459 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530473 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530482 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530493 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0429fae4-1356-4d61-86a3-267f74f27636" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530503 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" containerName="marketplace-operator" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530515 4994 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerName="controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530525 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530537 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530548 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530558 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530570 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530582 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" containerName="registry-server" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.530592 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3653335d-178c-4df8-a93d-4d19011298fe" containerName="route-controller-manager" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.531158 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.531931 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.533366 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544435 4994 generic.go:334] "Generic (PLEG): container finished" podID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" exitCode=0 Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544526 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerDied","Data":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544597 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7987b4b568-rd8h8" event={"ID":"ae5ec419-c993-43ff-b664-703b8b5a3d5a","Type":"ContainerDied","Data":"83f86e0cd6062d10a147cbaca0b7678a7e90a57f0aee2f43c47901ded05219f4"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.544625 4994 scope.go:117] "RemoveContainer" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.548006 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bzrd2" event={"ID":"64ec1b6f-2c0f-4cfc-be18-a2d311fae68c","Type":"ContainerDied","Data":"d1b32d28a2daabcbb6951ddc2404e012b74605f090a8de0ccde979112a9da8a3"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.548189 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzrd2" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.553478 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" event={"ID":"3653335d-178c-4df8-a93d-4d19011298fe","Type":"ContainerDied","Data":"86f6391bf2290461b2367cbde0871fb0815774ff3d6099505b8889c9a6ec884a"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.553655 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.561773 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpd8k" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.572757 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwzk5" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574141 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574194 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574217 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpd8k" event={"ID":"6525b40b-1c23-4533-a025-4d86bc406f00","Type":"ContainerDied","Data":"45f9adf9166a73c98a80bad9d037f2560042ccbdeec2aa18a7cbfa8528d64c9a"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.574251 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwzk5" event={"ID":"fdad0261-804d-41dc-8a25-48018f136c0f","Type":"ContainerDied","Data":"97695118e06c708e4423796822ac54cea80fe3fc3b7289e71d5f6ac300dfeb72"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.579790 4994 scope.go:117] "RemoveContainer" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: E0310 00:13:42.580283 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": container with ID starting with 436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514 not found: ID does not exist" containerID="436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.580332 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514"} err="failed to 
get container status \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": rpc error: code = NotFound desc = could not find container \"436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514\": container with ID starting with 436cd2f84bf0bcf36d5993ae27c83be187ae6b4e35d5668fcc5ab61361b7f514 not found: ID does not exist" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.580366 4994 scope.go:117] "RemoveContainer" containerID="4ac24c0285d02489c6991b060b56e4de350d7660656ffb13b6c269d046020cdf" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.583772 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c4tz9" event={"ID":"ab6cd76f-6272-4fcd-8c75-3040c45ef1b5","Type":"ContainerDied","Data":"7734e5951b02bb3a0a46ea5a16ee396269ee4c95d8725567063c864699e319c0"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.583890 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c4tz9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.588316 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" event={"ID":"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823","Type":"ContainerStarted","Data":"0fec64e6185812db73ed584e560e0a50856bffe1c60d52e399b5cbca436dfdc9"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.598270 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.599026 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" event={"ID":"46c4619e-ab9f-4fd9-9f3e-5b7ba9415823","Type":"ContainerStarted","Data":"f95cfb0b7116267392dd4c5e2d33e1273f81a2beef99d98c4632979e5c8bb3dc"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.599659 4994 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrh9x" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600023 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5kj4" event={"ID":"0429fae4-1356-4d61-86a3-267f74f27636","Type":"ContainerDied","Data":"c31a76fcba6e0a2edf574624c6292a3f51560169f1f1fa309a2ea336e40231d4"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600502 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zv2kt" event={"ID":"76aa065c-ed60-4237-b36f-5ce2865256ff","Type":"ContainerDied","Data":"3e81512696f04f227cf371ddcf1556e047699059d617fb7f43f9cba658930f7f"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600548 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrh9x" event={"ID":"a4a4dc2d-502f-4c05-ab76-1cc708f13006","Type":"ContainerDied","Data":"5863f96f41db6fc401ecb7000e3f5a6cfef96961acdd0f3f461004c58668116e"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.590053 4994 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ppfwk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.600658 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" podUID="46c4619e-ab9f-4fd9-9f3e-5b7ba9415823" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.592226 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5kj4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.597181 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zv2kt" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.602685 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tgf68" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.602932 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7qcn" event={"ID":"abe30cce-8379-4db8-838b-f48b4bc96621","Type":"ContainerDied","Data":"a14572119f65e2d0fbfc63101455582a2b9abfe6948028817dc155c8d3a9c7ab"} Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.603172 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7qcn" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628144 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628219 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628251 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628277 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628321 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628368 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod 
\"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628395 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.628412 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.644764 4994 scope.go:117] "RemoveContainer" containerID="8a4cdb4758a8d66ac4d964d75c363e426be3ea4f0d96bd2b4370bc01dbce1a3f" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.645920 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" podStartSLOduration=1.645891338 podStartE2EDuration="1.645891338s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:42.638082153 +0000 UTC m=+436.811788902" watchObservedRunningTime="2026-03-10 00:13:42.645891338 +0000 UTC m=+436.819598107" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.671130 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.688942 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7987b4b568-rd8h8"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.696626 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.697938 4994 scope.go:117] "RemoveContainer" containerID="80f8634f7b8323c210693d621de8d8f6643dfa095b77ff7b2c7b90894cebf6e9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.705231 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzrd2"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.712825 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.716425 4994 scope.go:117] "RemoveContainer" containerID="fb93dc31c2039db01d5ef603fb8cadf69dce66a5580028d2b9b19cc49d45e88a" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.724422 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wpd8k"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: 
\"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729668 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729691 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729721 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729792 
4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729843 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.729886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.731425 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.732297 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.733560 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.734901 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.742639 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.753534 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.753540 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5zm\" (UniqueName: 
\"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.756715 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"route-controller-manager-6846c44745-fnnz8\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.756741 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"controller-manager-56bcc8b679-5m2f4\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.766448 4994 scope.go:117] "RemoveContainer" containerID="68b888f4cbe6324465ef819f044ee90195b5c11bf20ab3d53a6faef3352a36c6" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.789585 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.792674 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwzk5"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.797288 4994 scope.go:117] "RemoveContainer" containerID="c11c2f647a7cbb01d788b2a61a4106505500b1d0634fb464e68d4e4b2d159f7e" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.797552 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.801188 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zv2kt"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.804171 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.806963 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s7qcn"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.810945 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.814787 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6cb646fb-vc9sg"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.818990 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.822762 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tgf68"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.823406 4994 scope.go:117] "RemoveContainer" containerID="aae194d8e4c12d216b3165e539102c03919767ba8d6987e2d169c5147eb55863" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.825725 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.829914 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5kj4"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.838691 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.844271 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c4tz9"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.848236 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.851748 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrh9x"] Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.854851 4994 scope.go:117] "RemoveContainer" containerID="8050e9fca1f15bb15fdfd5da3939a33eede922d4e417035502ed98660f52b965" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.860484 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.866378 4994 scope.go:117] "RemoveContainer" containerID="2ed048e8f43bfa8a5d112e7ab569e89e26c34e5ab5b69b6c77b3a42aea54c386" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.876353 4994 scope.go:117] "RemoveContainer" containerID="d9c63ca86073ed9073e0f89d99a6a3af753621532c12e53bb512c115a6852ded" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.886263 4994 scope.go:117] "RemoveContainer" containerID="c85cb3939718589608e5c84bc5f793f0ac91554e53660eb93bb8216dc7e11be6" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.901804 4994 scope.go:117] "RemoveContainer" containerID="757e8587473ddc7f21a46feaf304e66dbe443eeebcd4628d091bf2c8bec511d9" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.916120 4994 scope.go:117] "RemoveContainer" containerID="b4775c7cc3dc93dde45a7cd1c8d5a247763c1a3b907807a2f3655ae2194f4c42" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.934248 4994 scope.go:117] "RemoveContainer" containerID="ae53ee029113a53cb7678f130696b300909ff84d74bb704709a978c1c97a24b4" Mar 10 00:13:42 crc 
kubenswrapper[4994]: I0310 00:13:42.948389 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.960314 4994 scope.go:117] "RemoveContainer" containerID="979b74ea5a85ff4f5cee3a5418901bcd485b5c3af1206f8500f2ce239b83bc17" Mar 10 00:13:42 crc kubenswrapper[4994]: I0310 00:13:42.976137 4994 scope.go:117] "RemoveContainer" containerID="5520b611519111e180e88d1153308daf75771503d82a595371d6519dba75f44f" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.037444 4994 scope.go:117] "RemoveContainer" containerID="ed0179eebea68ea16a7ca2db77939a49213c1a39d735deda526e14363858855f" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.069818 4994 scope.go:117] "RemoveContainer" containerID="6e999aa11768c350f73b98e59c5adafa0716222aea76f3c3cc4ced602c5932bf" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.093708 4994 scope.go:117] "RemoveContainer" containerID="ca43f34122075e40b3a59998e1c0fdcc5eee5438f96f373d4f9e4b36228204ee" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.118027 4994 scope.go:117] "RemoveContainer" containerID="0872f65b4051bd51ddef4f02f90e17a29e60ddaf92da0bcd501b161644707129" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.135894 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.144650 4994 scope.go:117] "RemoveContainer" containerID="5fa6127d5cb315c05287e97611be0a26bc929ad831b3970419c02c806f804ed6" Mar 10 00:13:43 crc kubenswrapper[4994]: W0310 00:13:43.163424 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db4bc6b_389f_4ccf_b70e_eb97114f85e6.slice/crio-8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823 WatchSource:0}: Error finding 
container 8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823: Status 404 returned error can't find the container with id 8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823 Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.167227 4994 scope.go:117] "RemoveContainer" containerID="f5717a500fcfc936aa966df3e8984d98f5ff5ab90d17718d04d543deea170e1a" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.194432 4994 scope.go:117] "RemoveContainer" containerID="06150f7e8e87fb17f0464494725fda3c7ecd07c48589662349d651d3700f6139" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.229178 4994 scope.go:117] "RemoveContainer" containerID="ea2fcc0ccbdd2d99bdd5e8db5934d29568b2c080e467f0b26270d4267b4ac275" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.242342 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:13:43 crc kubenswrapper[4994]: W0310 00:13:43.253261 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3433a2f_3af8_44f9_bb23_67dac303c015.slice/crio-fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9 WatchSource:0}: Error finding container fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9: Status 404 returned error can't find the container with id fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9 Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.263634 4994 scope.go:117] "RemoveContainer" containerID="b12c2570f0f12ececa7d019201ed8ccc106ee186110fc56077d24c5532fccef4" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615421 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerStarted","Data":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} 
Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615471 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerStarted","Data":"8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.615713 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.622490 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerStarted","Data":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.622541 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerStarted","Data":"fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9"} Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.626927 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ppfwk" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.639991 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" podStartSLOduration=2.639973813 podStartE2EDuration="2.639973813s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:43.639209253 +0000 UTC m=+437.812916002" 
watchObservedRunningTime="2026-03-10 00:13:43.639973813 +0000 UTC m=+437.813680562" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.734102 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" podStartSLOduration=2.73408333 podStartE2EDuration="2.73408333s" podCreationTimestamp="2026-03-10 00:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:13:43.732304826 +0000 UTC m=+437.906011575" watchObservedRunningTime="2026-03-10 00:13:43.73408333 +0000 UTC m=+437.907790079" Mar 10 00:13:43 crc kubenswrapper[4994]: I0310 00:13:43.881796 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.561920 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0429fae4-1356-4d61-86a3-267f74f27636" path="/var/lib/kubelet/pods/0429fae4-1356-4d61-86a3-267f74f27636/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.562801 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3653335d-178c-4df8-a93d-4d19011298fe" path="/var/lib/kubelet/pods/3653335d-178c-4df8-a93d-4d19011298fe/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.563425 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ec1b6f-2c0f-4cfc-be18-a2d311fae68c" path="/var/lib/kubelet/pods/64ec1b6f-2c0f-4cfc-be18-a2d311fae68c/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.564834 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6525b40b-1c23-4533-a025-4d86bc406f00" path="/var/lib/kubelet/pods/6525b40b-1c23-4533-a025-4d86bc406f00/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.565610 4994 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="76aa065c-ed60-4237-b36f-5ce2865256ff" path="/var/lib/kubelet/pods/76aa065c-ed60-4237-b36f-5ce2865256ff/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.566958 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a4dc2d-502f-4c05-ab76-1cc708f13006" path="/var/lib/kubelet/pods/a4a4dc2d-502f-4c05-ab76-1cc708f13006/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.567906 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab6cd76f-6272-4fcd-8c75-3040c45ef1b5" path="/var/lib/kubelet/pods/ab6cd76f-6272-4fcd-8c75-3040c45ef1b5/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.568693 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe30cce-8379-4db8-838b-f48b4bc96621" path="/var/lib/kubelet/pods/abe30cce-8379-4db8-838b-f48b4bc96621/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.570176 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5ec419-c993-43ff-b664-703b8b5a3d5a" path="/var/lib/kubelet/pods/ae5ec419-c993-43ff-b664-703b8b5a3d5a/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.570812 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85bbdaa-daa8-4c69-abf9-9f1200eb07cd" path="/var/lib/kubelet/pods/b85bbdaa-daa8-4c69-abf9-9f1200eb07cd/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.572058 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdad0261-804d-41dc-8a25-48018f136c0f" path="/var/lib/kubelet/pods/fdad0261-804d-41dc-8a25-48018f136c0f/volumes" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.637178 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:44 crc kubenswrapper[4994]: I0310 00:13:44.644741 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:13:48 crc kubenswrapper[4994]: I0310 00:13:48.892724 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:13:48 crc kubenswrapper[4994]: I0310 00:13:48.892818 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.133862 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.135036 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137180 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137631 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.137912 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.146010 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.170776 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.271659 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.308486 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"auto-csr-approver-29551694-pngfv\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " 
pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.456662 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:00 crc kubenswrapper[4994]: I0310 00:14:00.993453 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:14:01 crc kubenswrapper[4994]: I0310 00:14:01.747804 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerStarted","Data":"e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7"} Mar 10 00:14:02 crc kubenswrapper[4994]: I0310 00:14:02.762567 4994 generic.go:334] "Generic (PLEG): container finished" podID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerID="ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147" exitCode=0 Mar 10 00:14:02 crc kubenswrapper[4994]: I0310 00:14:02.762655 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerDied","Data":"ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147"} Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.195293 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.323246 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") pod \"e91ae1c5-3f03-4439-b579-b828884a1b58\" (UID: \"e91ae1c5-3f03-4439-b579-b828884a1b58\") " Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.330262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d" (OuterVolumeSpecName: "kube-api-access-l7m9d") pod "e91ae1c5-3f03-4439-b579-b828884a1b58" (UID: "e91ae1c5-3f03-4439-b579-b828884a1b58"). InnerVolumeSpecName "kube-api-access-l7m9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.424011 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7m9d\" (UniqueName: \"kubernetes.io/projected/e91ae1c5-3f03-4439-b579-b828884a1b58-kube-api-access-l7m9d\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777373 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-pngfv" event={"ID":"e91ae1c5-3f03-4439-b579-b828884a1b58","Type":"ContainerDied","Data":"e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7"} Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777416 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68a4ce84d702f64ff9cdfbd72140a08214e1f8507c6d46e35baf2c9a4a20fa7" Mar 10 00:14:04 crc kubenswrapper[4994]: I0310 00:14:04.777484 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-pngfv" Mar 10 00:14:05 crc kubenswrapper[4994]: I0310 00:14:05.249352 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:14:05 crc kubenswrapper[4994]: I0310 00:14:05.254117 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-9zsf6"] Mar 10 00:14:06 crc kubenswrapper[4994]: I0310 00:14:06.566576 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1456dd8-5038-4bcc-8f19-51325ac84c02" path="/var/lib/kubelet/pods/a1456dd8-5038-4bcc-8f19-51325ac84c02/volumes" Mar 10 00:14:11 crc kubenswrapper[4994]: I0310 00:14:11.348830 4994 scope.go:117] "RemoveContainer" containerID="4c883bbf75f6eed116bbd752bb24651880ad9f3e32fa3f04163ece7d79b5b7c0" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.358333 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.358862 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager" containerID="cri-o://d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" gracePeriod=30 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.457236 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.458079 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager" 
containerID="cri-o://d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" gracePeriod=30 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.817558 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.822607 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850082 4994 generic.go:334] "Generic (PLEG): container finished" podID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" exitCode=0 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850140 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerDied","Data":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850166 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" event={"ID":"3db4bc6b-389f-4ccf-b70e-eb97114f85e6","Type":"ContainerDied","Data":"8fa7c46882c986280d70a7773e802034ebeacd07531eee3a2b9c40b0bf8c4823"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850183 4994 scope.go:117] "RemoveContainer" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.850272 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852410 4994 generic.go:334] "Generic (PLEG): container finished" podID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" exitCode=0 Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852441 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerDied","Data":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852461 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" event={"ID":"d3433a2f-3af8-44f9-bb23-67dac303c015","Type":"ContainerDied","Data":"fc95ae472f51e9ddbc3beb8980e7c48093c9c93d267c9c551e431299fcd96cf9"} Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.852505 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56bcc8b679-5m2f4" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862665 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862706 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862804 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862825 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") pod \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\" (UID: \"3db4bc6b-389f-4ccf-b70e-eb97114f85e6\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862867 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862906 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862934 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5zm\" (UniqueName: \"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.862968 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") pod \"d3433a2f-3af8-44f9-bb23-67dac303c015\" (UID: \"d3433a2f-3af8-44f9-bb23-67dac303c015\") " Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.863986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.864313 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.864764 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config" (OuterVolumeSpecName: "config") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.866494 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.866595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config" (OuterVolumeSpecName: "config") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.870706 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.874136 4994 scope.go:117] "RemoveContainer" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: E0310 00:14:15.875769 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": container with ID starting with d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6 not found: ID does not exist" containerID="d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.875981 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6"} err="failed to get container status \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": rpc error: code = NotFound desc = could not find container \"d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6\": container with ID starting with d749d7bbcab6c6cda552162f80c82661a38e32f6ff993eacd03916fe72ec00f6 not found: ID does not exist" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.876027 4994 scope.go:117] "RemoveContainer" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.877361 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm" (OuterVolumeSpecName: "kube-api-access-8w5zm") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "kube-api-access-8w5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.878384 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp" (OuterVolumeSpecName: "kube-api-access-9vrzp") pod "3db4bc6b-389f-4ccf-b70e-eb97114f85e6" (UID: "3db4bc6b-389f-4ccf-b70e-eb97114f85e6"). InnerVolumeSpecName "kube-api-access-9vrzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.886166 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3433a2f-3af8-44f9-bb23-67dac303c015" (UID: "d3433a2f-3af8-44f9-bb23-67dac303c015"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.895517 4994 scope.go:117] "RemoveContainer" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" Mar 10 00:14:15 crc kubenswrapper[4994]: E0310 00:14:15.895994 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": container with ID starting with d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175 not found: ID does not exist" containerID="d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.896032 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175"} err="failed to get container status \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": rpc error: code = NotFound desc = could not find container \"d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175\": container with ID starting with d6e16aefe8aa51d978ab1253340cdd490d172e06363bd1db4a853bd420963175 not found: ID does not exist" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964772 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3433a2f-3af8-44f9-bb23-67dac303c015-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964803 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964814 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5zm\" (UniqueName: 
\"kubernetes.io/projected/d3433a2f-3af8-44f9-bb23-67dac303c015-kube-api-access-8w5zm\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964822 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964829 4994 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3433a2f-3af8-44f9-bb23-67dac303c015-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964838 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964888 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964897 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:15 crc kubenswrapper[4994]: I0310 00:14:15.964904 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vrzp\" (UniqueName: \"kubernetes.io/projected/3db4bc6b-389f-4ccf-b70e-eb97114f85e6-kube-api-access-9vrzp\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.000692 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.176426 4994 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.180086 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6846c44745-fnnz8"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.190932 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.195212 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56bcc8b679-5m2f4"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551044 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"] Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551237 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551248 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551262 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551268 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc" Mar 10 00:14:16 crc kubenswrapper[4994]: E0310 00:14:16.551284 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551290 4994 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551368 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" containerName="controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551382 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" containerName="route-controller-manager" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" containerName="oc" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.551722 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554042 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554278 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554280 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.554530 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.555339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.557386 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.568694 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.572868 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db4bc6b-389f-4ccf-b70e-eb97114f85e6" path="/var/lib/kubelet/pods/3db4bc6b-389f-4ccf-b70e-eb97114f85e6/volumes" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573800 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573903 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573935 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.573953 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.574132 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3433a2f-3af8-44f9-bb23-67dac303c015" path="/var/lib/kubelet/pods/d3433a2f-3af8-44f9-bb23-67dac303c015/volumes" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.574924 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578337 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578377 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.578465 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582377 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582692 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.582722 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583299 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583546 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.583660 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.677825 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.677946 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " 
pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678079 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678171 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678196 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678239 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678261 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.678290 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.679281 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-client-ca\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.681095 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-config\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: 
I0310 00:14:16.683212 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9de70537-1ea7-4305-9674-34a0f6493916-proxy-ca-bundles\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.696359 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de70537-1ea7-4305-9674-34a0f6493916-serving-cert\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.711142 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8nj\" (UniqueName: \"kubernetes.io/projected/9de70537-1ea7-4305-9674-34a0f6493916-kube-api-access-zp8nj\") pod \"controller-manager-7df6847577-hghv8\" (UID: \"9de70537-1ea7-4305-9674-34a0f6493916\") " pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.778954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779033 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " 
pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779064 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779098 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.779954 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.781563 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.784478 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.807484 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"route-controller-manager-8d84b6f7f-z54gt\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.882650 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:16 crc kubenswrapper[4994]: I0310 00:14:16.892795 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.071125 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.110700 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7df6847577-hghv8"] Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.867774 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" event={"ID":"9de70537-1ea7-4305-9674-34a0f6493916","Type":"ContainerStarted","Data":"fa4f6aad3e2bd55dec5562201d04116607e7ec340427792efd6db5ffce0f6fa5"} Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868012 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868024 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" event={"ID":"9de70537-1ea7-4305-9674-34a0f6493916","Type":"ContainerStarted","Data":"c94e6296250c8576c6a3e6609eef102162e9c547d0081e5dae6fc85927973cb3"} Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868802 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerStarted","Data":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"} Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.868838 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" 
event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerStarted","Data":"074e301a7c36bb2cd8c0f431623fca75cce6ab78360d5f6f9180331c04931148"} Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.869079 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.873304 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.873683 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" Mar 10 00:14:17 crc kubenswrapper[4994]: I0310 00:14:17.905733 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7df6847577-hghv8" podStartSLOduration=2.90572002 podStartE2EDuration="2.90572002s" podCreationTimestamp="2026-03-10 00:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:17.887044286 +0000 UTC m=+472.060751035" watchObservedRunningTime="2026-03-10 00:14:17.90572002 +0000 UTC m=+472.079426759" Mar 10 00:14:18 crc kubenswrapper[4994]: I0310 00:14:18.892947 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:14:18 crc kubenswrapper[4994]: I0310 00:14:18.893438 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.068384 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" podStartSLOduration=6.068356991 podStartE2EDuration="6.068356991s" podCreationTimestamp="2026-03-10 00:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:17.926386167 +0000 UTC m=+472.100092926" watchObservedRunningTime="2026-03-10 00:14:21.068356991 +0000 UTC m=+475.242063780" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.069068 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.069372 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager" containerID="cri-o://a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" gracePeriod=30 Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.509047 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636166 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636284 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636326 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.636441 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") pod \"515d1a06-2d73-4eb3-931c-add8a5c7940f\" (UID: \"515d1a06-2d73-4eb3-931c-add8a5c7940f\") " Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.637438 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca" (OuterVolumeSpecName: "client-ca") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.638057 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config" (OuterVolumeSpecName: "config") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.642945 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn" (OuterVolumeSpecName: "kube-api-access-9szqn") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "kube-api-access-9szqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.647818 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "515d1a06-2d73-4eb3-931c-add8a5c7940f" (UID: "515d1a06-2d73-4eb3-931c-add8a5c7940f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746653 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746703 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szqn\" (UniqueName: \"kubernetes.io/projected/515d1a06-2d73-4eb3-931c-add8a5c7940f-kube-api-access-9szqn\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746718 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/515d1a06-2d73-4eb3-931c-add8a5c7940f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.746728 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515d1a06-2d73-4eb3-931c-add8a5c7940f-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896678 4994 generic.go:334] "Generic (PLEG): container finished" podID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" exitCode=0 Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896847 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.896915 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerDied","Data":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"} Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.897782 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt" event={"ID":"515d1a06-2d73-4eb3-931c-add8a5c7940f","Type":"ContainerDied","Data":"074e301a7c36bb2cd8c0f431623fca75cce6ab78360d5f6f9180331c04931148"} Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.897823 4994 scope.go:117] "RemoveContainer" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.920412 4994 scope.go:117] "RemoveContainer" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" Mar 10 00:14:21 crc kubenswrapper[4994]: E0310 00:14:21.921047 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": container with ID starting with a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1 not found: ID does not exist" containerID="a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.921108 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1"} err="failed to get container status \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": rpc error: code = NotFound desc = 
could not find container \"a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1\": container with ID starting with a7a69bf0cdb86729d9ae5ef6776c1a926c3f159b29d205d5fb87430ddde244d1 not found: ID does not exist" Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.953571 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:21 crc kubenswrapper[4994]: I0310 00:14:21.961410 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-z54gt"] Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.560755 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" path="/var/lib/kubelet/pods/515d1a06-2d73-4eb3-931c-add8a5c7940f/volumes" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561171 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:14:22 crc kubenswrapper[4994]: E0310 00:14:22.561348 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561359 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.561453 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="515d1a06-2d73-4eb3-931c-add8a5c7940f" containerName="route-controller-manager" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.564413 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567579 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567782 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567833 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567843 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.567965 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.568048 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.568446 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758100 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758200 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758258 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.758360 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859259 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859307 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod 
\"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859332 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.859360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.860501 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.860854 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.869171 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.878575 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"route-controller-manager-84db66d99d-vvln4\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:22 crc kubenswrapper[4994]: I0310 00:14:22.891420 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.329676 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.916699 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerStarted","Data":"92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694"} Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.917124 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerStarted","Data":"bf8f57540b2c9250bb5294616cdfe2a18d71874683f0e928b618b3a35928461f"} Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.918950 4994 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:23 crc kubenswrapper[4994]: I0310 00:14:23.935156 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podStartSLOduration=2.935130184 podStartE2EDuration="2.935130184s" podCreationTimestamp="2026-03-10 00:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:23.933285946 +0000 UTC m=+478.106992735" watchObservedRunningTime="2026-03-10 00:14:23.935130184 +0000 UTC m=+478.108836973" Mar 10 00:14:24 crc kubenswrapper[4994]: I0310 00:14:24.072234 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.034520 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" containerID="cri-o://8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" gracePeriod=15 Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.518497 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582120 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:41 crc kubenswrapper[4994]: E0310 00:14:41.582382 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582403 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.582565 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" containerName="oauth-openshift" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.583169 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.597931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706292 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706417 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706699 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706905 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.706986 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: 
\"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707052 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707111 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707165 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707254 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 
00:14:41.707304 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707347 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707382 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") pod \"903778b5-0c60-42d6-8773-a1345817fe1f\" (UID: \"903778b5-0c60-42d6-8773-a1345817fe1f\") " Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707596 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707652 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " 
pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707714 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707749 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707846 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707925 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.707969 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708029 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708065 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708136 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708188 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.708238 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.709014 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.709041 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710448 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710584 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710617 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710846 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.710998 4994 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.711097 4994 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/903778b5-0c60-42d6-8773-a1345817fe1f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.711123 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.714835 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj" (OuterVolumeSpecName: "kube-api-access-48dxj") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "kube-api-access-48dxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.716073 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.716143 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.719267 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.719688 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.721262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.722013 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.722707 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.725375 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "903778b5-0c60-42d6-8773-a1345817fe1f" (UID: "903778b5-0c60-42d6-8773-a1345817fe1f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812313 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812396 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812416 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812445 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812477 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812500 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812523 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812542 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " 
pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812568 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812590 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812616 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812666 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812718 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812734 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812747 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812759 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48dxj\" (UniqueName: \"kubernetes.io/projected/903778b5-0c60-42d6-8773-a1345817fe1f-kube-api-access-48dxj\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812771 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812785 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812798 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812812 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812824 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.812837 4994 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/903778b5-0c60-42d6-8773-a1345817fe1f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.813199 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-dir\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.813943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.815618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.815748 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-audit-policies\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.816049 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-error\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.816638 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc 
kubenswrapper[4994]: I0310 00:14:41.819007 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.819055 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.819772 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.820036 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.820208 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.823304 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-system-session\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.823805 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-v4-0-config-user-template-login\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.832284 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdwtj\" (UniqueName: \"kubernetes.io/projected/f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b-kube-api-access-vdwtj\") pod \"oauth-openshift-7c96dbfd49-scp27\" (UID: \"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b\") " pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:41 crc kubenswrapper[4994]: I0310 00:14:41.919997 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.046575 4994 generic.go:334] "Generic (PLEG): container finished" podID="903778b5-0c60-42d6-8773-a1345817fe1f" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" exitCode=0 Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047022 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerDied","Data":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047118 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" event={"ID":"903778b5-0c60-42d6-8773-a1345817fe1f","Type":"ContainerDied","Data":"579d8e47d1cae1e88f269db0e29bcd43ee29c56b451d1f988d01fa0b8de660ec"} Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.047680 4994 scope.go:117] "RemoveContainer" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.049122 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fxpkq" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.089183 4994 scope.go:117] "RemoveContainer" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: E0310 00:14:42.089629 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": container with ID starting with 8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3 not found: ID does not exist" containerID="8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.089654 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3"} err="failed to get container status \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": rpc error: code = NotFound desc = could not find container \"8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3\": container with ID starting with 8c6f819fd6497ca6ad24b48c2f03f892324d4851311bb1c5697f2ee4c822c3f3 not found: ID does not exist" Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.106031 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.111662 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fxpkq"] Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.382732 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c96dbfd49-scp27"] Mar 10 00:14:42 crc kubenswrapper[4994]: W0310 00:14:42.391204 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56949c3_edcf_43cf_bb6f_a3e49d3fcb8b.slice/crio-03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012 WatchSource:0}: Error finding container 03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012: Status 404 returned error can't find the container with id 03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012 Mar 10 00:14:42 crc kubenswrapper[4994]: I0310 00:14:42.563563 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903778b5-0c60-42d6-8773-a1345817fe1f" path="/var/lib/kubelet/pods/903778b5-0c60-42d6-8773-a1345817fe1f/volumes" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.055736 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" event={"ID":"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b","Type":"ContainerStarted","Data":"e605f0ae01431c850277b1b39ca954ccb4e6f85a736f0d82e023f7d83dba93c3"} Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.055814 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" event={"ID":"f56949c3-edcf-43cf-bb6f-a3e49d3fcb8b","Type":"ContainerStarted","Data":"03b27254f67e03c905777128a2a600d6ed7c7516c3082c89985fab6e592b8012"} Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.056031 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.087432 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" podStartSLOduration=27.087397746 podStartE2EDuration="27.087397746s" podCreationTimestamp="2026-03-10 00:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
00:14:43.085305522 +0000 UTC m=+497.259012281" watchObservedRunningTime="2026-03-10 00:14:43.087397746 +0000 UTC m=+497.261104525" Mar 10 00:14:43 crc kubenswrapper[4994]: I0310 00:14:43.398371 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c96dbfd49-scp27" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.892944 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.893766 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.893849 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.894806 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:14:48 crc kubenswrapper[4994]: I0310 00:14:48.894953 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" containerID="cri-o://e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" gracePeriod=600 Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109226 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" exitCode=0 Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109299 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f"} Mar 10 00:14:49 crc kubenswrapper[4994]: I0310 00:14:49.109359 4994 scope.go:117] "RemoveContainer" containerID="345c0584a7496afccc0329f382c99e267d5962a6733da1806cd05dbb15bf54c6" Mar 10 00:14:50 crc kubenswrapper[4994]: I0310 00:14:50.119663 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.628843 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.630213 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.652199 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798583 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798638 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798711 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798760 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798791 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.798840 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.847648 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900157 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900214 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900251 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900333 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc 
kubenswrapper[4994]: I0310 00:14:52.900364 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900384 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.900404 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.901537 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d933f7-4a58-4a81-8916-647ed943e26d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.905704 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d933f7-4a58-4a81-8916-647ed943e26d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r27vp\" (UID: 
\"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.906435 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-registry-tls\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.906761 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-trusted-ca\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.907063 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d933f7-4a58-4a81-8916-647ed943e26d-registry-certificates\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.918961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-bound-sa-token\") pod \"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.927198 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbs5\" (UniqueName: \"kubernetes.io/projected/11d933f7-4a58-4a81-8916-647ed943e26d-kube-api-access-zhbs5\") pod 
\"image-registry-66df7c8f76-r27vp\" (UID: \"11d933f7-4a58-4a81-8916-647ed943e26d\") " pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:52 crc kubenswrapper[4994]: I0310 00:14:52.947847 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:14:53 crc kubenswrapper[4994]: I0310 00:14:53.414539 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r27vp"] Mar 10 00:14:53 crc kubenswrapper[4994]: W0310 00:14:53.425491 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d933f7_4a58_4a81_8916_647ed943e26d.slice/crio-eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4 WatchSource:0}: Error finding container eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4: Status 404 returned error can't find the container with id eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4 Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146631 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" event={"ID":"11d933f7-4a58-4a81-8916-647ed943e26d","Type":"ContainerStarted","Data":"55950f1339f17704e52315e7ae0d927736339fd0ab32d668f10cffaa923d280d"} Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146680 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" event={"ID":"11d933f7-4a58-4a81-8916-647ed943e26d","Type":"ContainerStarted","Data":"eac6e2a1bb3b63843499e6a6ff9a1655f2d3b0701f46affb8418c3cf678c8be4"} Mar 10 00:14:54 crc kubenswrapper[4994]: I0310 00:14:54.146772 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.133554 
4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" podStartSLOduration=8.13352918 podStartE2EDuration="8.13352918s" podCreationTimestamp="2026-03-10 00:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:14:54.171484605 +0000 UTC m=+508.345191354" watchObservedRunningTime="2026-03-10 00:15:00.13352918 +0000 UTC m=+514.307235969" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.136636 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.137690 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.140290 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.143976 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.157995 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.299595 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: 
I0310 00:15:00.299661 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.299809 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401097 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401244 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.401348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod 
\"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.402925 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.415432 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.422515 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"collect-profiles-29551695-642fv\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.501183 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:00 crc kubenswrapper[4994]: I0310 00:15:00.983471 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv"] Mar 10 00:15:01 crc kubenswrapper[4994]: I0310 00:15:01.201691 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerStarted","Data":"3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3"} Mar 10 00:15:01 crc kubenswrapper[4994]: I0310 00:15:01.202071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerStarted","Data":"27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e"} Mar 10 00:15:02 crc kubenswrapper[4994]: I0310 00:15:02.210777 4994 generic.go:334] "Generic (PLEG): container finished" podID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerID="3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3" exitCode=0 Mar 10 00:15:02 crc kubenswrapper[4994]: I0310 00:15:02.210834 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerDied","Data":"3e3524bdde5e5142c068a121dd13533f73b2944f4efabb9885c4bafd02497ae3"} Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.588966 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754162 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754341 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.754422 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") pod \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\" (UID: \"ec6570de-1422-4eaf-a42d-dc5f62f91eba\") " Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.756440 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.764185 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.766102 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk" (OuterVolumeSpecName: "kube-api-access-lmrfk") pod "ec6570de-1422-4eaf-a42d-dc5f62f91eba" (UID: "ec6570de-1422-4eaf-a42d-dc5f62f91eba"). InnerVolumeSpecName "kube-api-access-lmrfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856543 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrfk\" (UniqueName: \"kubernetes.io/projected/ec6570de-1422-4eaf-a42d-dc5f62f91eba-kube-api-access-lmrfk\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856592 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec6570de-1422-4eaf-a42d-dc5f62f91eba-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:03 crc kubenswrapper[4994]: I0310 00:15:03.856612 4994 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec6570de-1422-4eaf-a42d-dc5f62f91eba-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223435 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" event={"ID":"ec6570de-1422-4eaf-a42d-dc5f62f91eba","Type":"ContainerDied","Data":"27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e"} Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223513 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27075e8e90509a20d0eb37e286e210f6b710fd6b7a61129393ecec9f6ef0c11e" Mar 10 00:15:04 crc kubenswrapper[4994]: I0310 00:15:04.223536 4994 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-642fv" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.142707 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:07 crc kubenswrapper[4994]: E0310 00:15:07.144057 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.144121 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.144482 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6570de-1422-4eaf-a42d-dc5f62f91eba" containerName="collect-profiles" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.147046 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.150026 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.176727 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.304584 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.305100 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.305422 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.331688 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.333671 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.337115 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.348578 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407227 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407335 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407433 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.407933 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-catalog-content\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.408177 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad4ae94a-f55f-4133-9b34-f95992f5454b-utilities\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.440227 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s8cb\" (UniqueName: \"kubernetes.io/projected/ad4ae94a-f55f-4133-9b34-f95992f5454b-kube-api-access-2s8cb\") pod \"redhat-operators-9dnvg\" (UID: \"ad4ae94a-f55f-4133-9b34-f95992f5454b\") " pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.508670 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flst2\" (UniqueName: 
\"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.508774 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.509005 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.519519 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610583 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flst2\" (UniqueName: \"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610660 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.610822 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.611793 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-utilities\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.611907 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-catalog-content\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " 
pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.642446 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flst2\" (UniqueName: \"kubernetes.io/projected/2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105-kube-api-access-flst2\") pod \"certified-operators-4vf56\" (UID: \"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105\") " pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:07 crc kubenswrapper[4994]: I0310 00:15:07.659253 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.024971 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9dnvg"] Mar 10 00:15:08 crc kubenswrapper[4994]: W0310 00:15:08.032119 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4ae94a_f55f_4133_9b34_f95992f5454b.slice/crio-3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f WatchSource:0}: Error finding container 3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f: Status 404 returned error can't find the container with id 3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.116590 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vf56"] Mar 10 00:15:08 crc kubenswrapper[4994]: W0310 00:15:08.124676 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4e3a5e_5559_4e0b_a9b5_f117c0dcf105.slice/crio-f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295 WatchSource:0}: Error finding container f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295: Status 404 returned error can't find the container 
with id f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295 Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.250507 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerStarted","Data":"f2fad8890475296348f668388e57267764dee5e2656aabba90c5c677efb49295"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.251945 4994 generic.go:334] "Generic (PLEG): container finished" podID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerID="2c1e9079a493a5f62735a8cbb8e8de927eea69492fd29b516317758d01c362df" exitCode=0 Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.251982 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerDied","Data":"2c1e9079a493a5f62735a8cbb8e8de927eea69492fd29b516317758d01c362df"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.252003 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerStarted","Data":"3ae937bfe2b438c26596b4336eb3f80fb45022398ac81956b458fbde8a6c9a8f"} Mar 10 00:15:08 crc kubenswrapper[4994]: I0310 00:15:08.253592 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.257185 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105" containerID="f8b7a572ee09996595bda7b0a6810aa9c629ec20b78ffb9010fccfcdcbf17cf5" exitCode=0 Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.257447 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" 
event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerDied","Data":"f8b7a572ee09996595bda7b0a6810aa9c629ec20b78ffb9010fccfcdcbf17cf5"} Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.737392 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45nlb"] Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.738928 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.741601 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.748296 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45nlb"] Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844708 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.844811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod 
\"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.926104 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.927229 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.929958 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.945637 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.945746 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946278 4994 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-catalog-content\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.946669 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-utilities\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.955972 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:15:09 crc kubenswrapper[4994]: I0310 00:15:09.974996 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvmv9\" (UniqueName: \"kubernetes.io/projected/2aaa4876-9545-4d43-b7a3-02d53c8ef8f5-kube-api-access-hvmv9\") pod \"community-operators-45nlb\" (UID: \"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5\") " pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047045 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047439 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " 
pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.047493 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.059955 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149053 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149137 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.149219 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.150280 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.150653 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.176427 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"redhat-marketplace-2b884\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.241108 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.266866 4994 generic.go:334] "Generic (PLEG): container finished" podID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerID="2f057d8119fc87624569ba8ca8ad6e525c8575e2620f5f49190927e9fe1fcdbc" exitCode=0 Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.266927 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerDied","Data":"2f057d8119fc87624569ba8ca8ad6e525c8575e2620f5f49190927e9fe1fcdbc"} Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.318717 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45nlb"] Mar 10 00:15:10 crc kubenswrapper[4994]: I0310 00:15:10.715101 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:15:10 crc kubenswrapper[4994]: W0310 00:15:10.724799 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f42081_c92d_42cd_90b5_329a5ae6c2ad.slice/crio-2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8 WatchSource:0}: Error finding container 2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8: Status 404 returned error can't find the container with id 2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8 Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.276850 4994 generic.go:334] "Generic (PLEG): container finished" podID="2aaa4876-9545-4d43-b7a3-02d53c8ef8f5" containerID="7ab894a2389f7d1bd80a6804cda0404c67ca01fc2a83e61e70b76f10cd2491ba" exitCode=0 Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.276990 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" 
event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerDied","Data":"7ab894a2389f7d1bd80a6804cda0404c67ca01fc2a83e61e70b76f10cd2491ba"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.277029 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"b7250913fcf3d29c66ad62790938642d67f029fbfbe79948b71977cbe1ef12f3"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.280444 4994 generic.go:334] "Generic (PLEG): container finished" podID="2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105" containerID="78cce51b9958b4131f8c656983a4b40ced8f6b40d33f51d1467669721bcb10ea" exitCode=0 Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.280623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerDied","Data":"78cce51b9958b4131f8c656983a4b40ced8f6b40d33f51d1467669721bcb10ea"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.284370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9dnvg" event={"ID":"ad4ae94a-f55f-4133-9b34-f95992f5454b","Type":"ContainerStarted","Data":"0754e3199dca8df9639ade2c3f4f9de8b339d050a381cf423dc4366a663f8b81"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.286150 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" exitCode=0 Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.286183 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 
00:15:11.286208 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerStarted","Data":"2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8"} Mar 10 00:15:11 crc kubenswrapper[4994]: I0310 00:15:11.372891 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9dnvg" podStartSLOduration=1.830604906 podStartE2EDuration="4.372814849s" podCreationTimestamp="2026-03-10 00:15:07 +0000 UTC" firstStartedPulling="2026-03-10 00:15:08.253321745 +0000 UTC m=+522.427028504" lastFinishedPulling="2026-03-10 00:15:10.795531668 +0000 UTC m=+524.969238447" observedRunningTime="2026-03-10 00:15:11.367711655 +0000 UTC m=+525.541418444" watchObservedRunningTime="2026-03-10 00:15:11.372814849 +0000 UTC m=+525.546521638" Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.300150 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96"} Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.305986 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vf56" event={"ID":"2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105","Type":"ContainerStarted","Data":"9dd79697a8e5687f54bb0cc307dbf3e35bf4e0e64c6be831299048032444a299"} Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.308978 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" exitCode=0 Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.309841 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" 
event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a"} Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.343803 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vf56" podStartSLOduration=2.677153419 podStartE2EDuration="5.343783503s" podCreationTimestamp="2026-03-10 00:15:07 +0000 UTC" firstStartedPulling="2026-03-10 00:15:09.258431387 +0000 UTC m=+523.432138136" lastFinishedPulling="2026-03-10 00:15:11.925061441 +0000 UTC m=+526.098768220" observedRunningTime="2026-03-10 00:15:12.342003016 +0000 UTC m=+526.515709775" watchObservedRunningTime="2026-03-10 00:15:12.343783503 +0000 UTC m=+526.517490272" Mar 10 00:15:12 crc kubenswrapper[4994]: I0310 00:15:12.956776 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-r27vp" Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.028118 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.316401 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerStarted","Data":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"} Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.317778 4994 generic.go:334] "Generic (PLEG): container finished" podID="2aaa4876-9545-4d43-b7a3-02d53c8ef8f5" containerID="388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96" exitCode=0 Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.318457 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" 
event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerDied","Data":"388b1069a6c755e8c923e733a205c8312af5821f43f85d06f4a0ca0e4bdd6f96"} Mar 10 00:15:13 crc kubenswrapper[4994]: I0310 00:15:13.337695 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2b884" podStartSLOduration=2.920161923 podStartE2EDuration="4.33767977s" podCreationTimestamp="2026-03-10 00:15:09 +0000 UTC" firstStartedPulling="2026-03-10 00:15:11.287811244 +0000 UTC m=+525.461518033" lastFinishedPulling="2026-03-10 00:15:12.705329131 +0000 UTC m=+526.879035880" observedRunningTime="2026-03-10 00:15:13.33656292 +0000 UTC m=+527.510269679" watchObservedRunningTime="2026-03-10 00:15:13.33767977 +0000 UTC m=+527.511386519" Mar 10 00:15:14 crc kubenswrapper[4994]: I0310 00:15:14.326422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45nlb" event={"ID":"2aaa4876-9545-4d43-b7a3-02d53c8ef8f5","Type":"ContainerStarted","Data":"a90c33c3ace798b84ccc4b92b44a44778e9d8f90bb203eb635ac96337b56d324"} Mar 10 00:15:14 crc kubenswrapper[4994]: I0310 00:15:14.349959 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45nlb" podStartSLOduration=2.741832433 podStartE2EDuration="5.349933349s" podCreationTimestamp="2026-03-10 00:15:09 +0000 UTC" firstStartedPulling="2026-03-10 00:15:11.27969793 +0000 UTC m=+525.453404689" lastFinishedPulling="2026-03-10 00:15:13.887798816 +0000 UTC m=+528.061505605" observedRunningTime="2026-03-10 00:15:14.344719872 +0000 UTC m=+528.518426631" watchObservedRunningTime="2026-03-10 00:15:14.349933349 +0000 UTC m=+528.523640108" Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.520511 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.520947 4994 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.661148 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.661214 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:17 crc kubenswrapper[4994]: I0310 00:15:17.708229 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:18 crc kubenswrapper[4994]: I0310 00:15:18.416777 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vf56" Mar 10 00:15:18 crc kubenswrapper[4994]: I0310 00:15:18.599407 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9dnvg" podUID="ad4ae94a-f55f-4133-9b34-f95992f5454b" containerName="registry-server" probeResult="failure" output=< Mar 10 00:15:18 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:15:18 crc kubenswrapper[4994]: > Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.062217 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.062302 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.107464 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.241692 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.241739 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.315811 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.445730 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45nlb" Mar 10 00:15:20 crc kubenswrapper[4994]: I0310 00:15:20.446452 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:15:21 crc kubenswrapper[4994]: I0310 00:15:21.066707 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:15:21 crc kubenswrapper[4994]: I0310 00:15:21.067021 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" containerID="cri-o://92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694" gracePeriod=30 Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.394922 4994 generic.go:334] "Generic (PLEG): container finished" podID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerID="92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694" exitCode=0 Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.395194 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" 
event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerDied","Data":"92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694"} Mar 10 00:15:22 crc kubenswrapper[4994]: I0310 00:15:22.971900 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.014637 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"] Mar 10 00:15:23 crc kubenswrapper[4994]: E0310 00:15:23.015186 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015208 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015331 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.015774 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.025295 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"] Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.053801 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.053866 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054026 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054075 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") pod \"d7d43a41-1177-47b1-ac5f-3d4309491587\" (UID: \"d7d43a41-1177-47b1-ac5f-3d4309491587\") " Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.054922 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config" (OuterVolumeSpecName: "config") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: 
"d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.055574 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.064117 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw" (OuterVolumeSpecName: "kube-api-access-4tdtw") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "kube-api-access-4tdtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.064135 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7d43a41-1177-47b1-ac5f-3d4309491587" (UID: "d7d43a41-1177-47b1-ac5f-3d4309491587"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156088 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156258 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156387 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156570 4994 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156619 4994 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d43a41-1177-47b1-ac5f-3d4309491587-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156652 4994 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d43a41-1177-47b1-ac5f-3d4309491587-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.156676 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdtw\" (UniqueName: \"kubernetes.io/projected/d7d43a41-1177-47b1-ac5f-3d4309491587-kube-api-access-4tdtw\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258532 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258601 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258639 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.258676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.260012 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-client-ca\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.260669 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-config\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.265034 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-serving-cert\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 
00:15:23.289268 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxczt\" (UniqueName: \"kubernetes.io/projected/8e2a94a6-6f1b-4a7e-a298-3f4c517754bc-kube-api-access-qxczt\") pod \"route-controller-manager-8d84b6f7f-vwrp9\" (UID: \"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc\") " pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.349025 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406225 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" event={"ID":"d7d43a41-1177-47b1-ac5f-3d4309491587","Type":"ContainerDied","Data":"bf8f57540b2c9250bb5294616cdfe2a18d71874683f0e928b618b3a35928461f"} Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406307 4994 scope.go:117] "RemoveContainer" containerID="92d6fcd0f4f7fb4ba9168b84d9692bf78d32d2b1f295642d2ecc92e5c84e1694" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.406325 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.456033 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.460349 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4"] Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.651970 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9"] Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.892606 4994 patch_prober.go:28] interesting pod/route-controller-manager-84db66d99d-vvln4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:15:23 crc kubenswrapper[4994]: I0310 00:15:23.892709 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84db66d99d-vvln4" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.415962 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" event={"ID":"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc","Type":"ContainerStarted","Data":"ef076e6b86a043fc0a26be4822c4501f0022e6ae7d4e4ffb20ac283bed8aba1f"} Mar 10 00:15:24 crc 
kubenswrapper[4994]: I0310 00:15:24.416032 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" event={"ID":"8e2a94a6-6f1b-4a7e-a298-3f4c517754bc","Type":"ContainerStarted","Data":"812abc6e08175598e7ba80f628f3ccd1ad9da1302e824873176e499249c2fd09"} Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.416259 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.448750 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" podStartSLOduration=3.44872114 podStartE2EDuration="3.44872114s" podCreationTimestamp="2026-03-10 00:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:15:24.443244036 +0000 UTC m=+538.616950825" watchObservedRunningTime="2026-03-10 00:15:24.44872114 +0000 UTC m=+538.622427919" Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.497124 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d84b6f7f-vwrp9" Mar 10 00:15:24 crc kubenswrapper[4994]: I0310 00:15:24.585190 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d43a41-1177-47b1-ac5f-3d4309491587" path="/var/lib/kubelet/pods/d7d43a41-1177-47b1-ac5f-3d4309491587/volumes" Mar 10 00:15:27 crc kubenswrapper[4994]: I0310 00:15:27.590045 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 00:15:27 crc kubenswrapper[4994]: I0310 00:15:27.698815 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9dnvg" Mar 10 
00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.077144 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" containerID="cri-o://0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" gracePeriod=30 Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514230 4994 generic.go:334] "Generic (PLEG): container finished" podID="295cba62-fd24-4245-8773-866ee134a29e" containerID="0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" exitCode=0 Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514382 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerDied","Data":"0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba"} Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514723 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" event={"ID":"295cba62-fd24-4245-8773-866ee134a29e","Type":"ContainerDied","Data":"dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f"} Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.514746 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7db90166bf3d060ca8294e206e48fea14498e48c7b2ef7fe0c1e7d9d4dd09f" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.542689 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657644 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657699 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657736 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657770 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.657824 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658798 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658829 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.658969 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"295cba62-fd24-4245-8773-866ee134a29e\" (UID: \"295cba62-fd24-4245-8773-866ee134a29e\") " Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.659304 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.659393 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.661597 4994 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.661618 4994 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/295cba62-fd24-4245-8773-866ee134a29e-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.664136 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.664842 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.668035 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.668383 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc" (OuterVolumeSpecName: "kube-api-access-kbrkc") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "kube-api-access-kbrkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.672430 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.674211 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "295cba62-fd24-4245-8773-866ee134a29e" (UID: "295cba62-fd24-4245-8773-866ee134a29e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.762968 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbrkc\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-kube-api-access-kbrkc\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763025 4994 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/295cba62-fd24-4245-8773-866ee134a29e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763039 4994 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763051 4994 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/295cba62-fd24-4245-8773-866ee134a29e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:38 crc kubenswrapper[4994]: I0310 00:15:38.763065 4994 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/295cba62-fd24-4245-8773-866ee134a29e-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.521493 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-75h8c" Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.577066 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:15:39 crc kubenswrapper[4994]: I0310 00:15:39.584190 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-75h8c"] Mar 10 00:15:40 crc kubenswrapper[4994]: I0310 00:15:40.565503 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295cba62-fd24-4245-8773-866ee134a29e" path="/var/lib/kubelet/pods/295cba62-fd24-4245-8773-866ee134a29e/volumes" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.149775 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:00 crc kubenswrapper[4994]: E0310 00:16:00.150791 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.150816 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.151029 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="295cba62-fd24-4245-8773-866ee134a29e" containerName="registry" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.151630 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.154613 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.155421 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.156191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.164671 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.242544 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.344295 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.379403 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"auto-csr-approver-29551696-w9ztg\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " 
pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.482171 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:00 crc kubenswrapper[4994]: I0310 00:16:00.973814 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:16:01 crc kubenswrapper[4994]: I0310 00:16:01.676259 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerStarted","Data":"cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4"} Mar 10 00:16:02 crc kubenswrapper[4994]: I0310 00:16:02.685053 4994 generic.go:334] "Generic (PLEG): container finished" podID="f25bd204-3572-4880-b74f-764a5a3e0123" containerID="60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511" exitCode=0 Mar 10 00:16:02 crc kubenswrapper[4994]: I0310 00:16:02.685147 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerDied","Data":"60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511"} Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.098669 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.197024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") pod \"f25bd204-3572-4880-b74f-764a5a3e0123\" (UID: \"f25bd204-3572-4880-b74f-764a5a3e0123\") " Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.204043 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b" (OuterVolumeSpecName: "kube-api-access-mtb5b") pod "f25bd204-3572-4880-b74f-764a5a3e0123" (UID: "f25bd204-3572-4880-b74f-764a5a3e0123"). InnerVolumeSpecName "kube-api-access-mtb5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.298663 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtb5b\" (UniqueName: \"kubernetes.io/projected/f25bd204-3572-4880-b74f-764a5a3e0123-kube-api-access-mtb5b\") on node \"crc\" DevicePath \"\"" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.702809 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" event={"ID":"f25bd204-3572-4880-b74f-764a5a3e0123","Type":"ContainerDied","Data":"cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4"} Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.703169 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc4874c70c7b37039156aae07da17b94f31ae1faae8e17c8685d374d2debf7f4" Mar 10 00:16:04 crc kubenswrapper[4994]: I0310 00:16:04.702912 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-w9ztg" Mar 10 00:16:05 crc kubenswrapper[4994]: I0310 00:16:05.167255 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:16:05 crc kubenswrapper[4994]: I0310 00:16:05.171913 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-7rbl8"] Mar 10 00:16:06 crc kubenswrapper[4994]: I0310 00:16:06.562588 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04aae5d-b067-4e49-82f3-66412ec1bba6" path="/var/lib/kubelet/pods/f04aae5d-b067-4e49-82f3-66412ec1bba6/volumes" Mar 10 00:16:11 crc kubenswrapper[4994]: I0310 00:16:11.534391 4994 scope.go:117] "RemoveContainer" containerID="669d56e78759519de5a6dd239e9cf24e944424e3eb64de05a26d842e32401407" Mar 10 00:16:11 crc kubenswrapper[4994]: I0310 00:16:11.583692 4994 scope.go:117] "RemoveContainer" containerID="ee78e5054ad5ad8a035342e7024985079f992d0c77022319d8f1e7f3d55f9eb9" Mar 10 00:17:11 crc kubenswrapper[4994]: I0310 00:17:11.697666 4994 scope.go:117] "RemoveContainer" containerID="0b99026327a0246e8d6a6998d063d7da1dc8dded77c229f16aea5a63dc4137ba" Mar 10 00:17:18 crc kubenswrapper[4994]: I0310 00:17:18.893127 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:17:18 crc kubenswrapper[4994]: I0310 00:17:18.893510 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:17:48 crc 
kubenswrapper[4994]: I0310 00:17:48.893300 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:17:48 crc kubenswrapper[4994]: I0310 00:17:48.893999 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.146779 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:00 crc kubenswrapper[4994]: E0310 00:18:00.147863 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.147923 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.148112 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" containerName="oc" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.148750 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.152226 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.154372 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.156695 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.193779 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.246993 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.348648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.382932 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"auto-csr-approver-29551698-2d6g5\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " 
pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.520405 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:00 crc kubenswrapper[4994]: I0310 00:18:00.757411 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:18:01 crc kubenswrapper[4994]: I0310 00:18:01.213061 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerStarted","Data":"4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2"} Mar 10 00:18:03 crc kubenswrapper[4994]: I0310 00:18:03.245110 4994 generic.go:334] "Generic (PLEG): container finished" podID="6471fd89-1c92-498d-ba15-149418259c58" containerID="b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f" exitCode=0 Mar 10 00:18:03 crc kubenswrapper[4994]: I0310 00:18:03.245172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerDied","Data":"b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f"} Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.570851 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.704575 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") pod \"6471fd89-1c92-498d-ba15-149418259c58\" (UID: \"6471fd89-1c92-498d-ba15-149418259c58\") " Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.719950 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s" (OuterVolumeSpecName: "kube-api-access-hnq9s") pod "6471fd89-1c92-498d-ba15-149418259c58" (UID: "6471fd89-1c92-498d-ba15-149418259c58"). InnerVolumeSpecName "kube-api-access-hnq9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:04 crc kubenswrapper[4994]: I0310 00:18:04.806713 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnq9s\" (UniqueName: \"kubernetes.io/projected/6471fd89-1c92-498d-ba15-149418259c58-kube-api-access-hnq9s\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260691 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" event={"ID":"6471fd89-1c92-498d-ba15-149418259c58","Type":"ContainerDied","Data":"4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2"} Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260751 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8734c3105aac95c1dec5df57a7a79d1a3b6f1e13fd44aae694240c430ec8c2" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.260771 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-2d6g5" Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.645427 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:18:05 crc kubenswrapper[4994]: I0310 00:18:05.652581 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-29hls"] Mar 10 00:18:06 crc kubenswrapper[4994]: I0310 00:18:06.562481 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a70a0f-0e78-4f55-9eee-62099acf734d" path="/var/lib/kubelet/pods/24a70a0f-0e78-4f55-9eee-62099acf734d/volumes" Mar 10 00:18:11 crc kubenswrapper[4994]: I0310 00:18:11.750182 4994 scope.go:117] "RemoveContainer" containerID="f159c5160bbebc2d55cd9bb33ea390e800dbff7ea9620c436631139ca88c6b3b" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.490015 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491249 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" containerID="cri-o://ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491306 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" containerID="cri-o://80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491404 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" 
containerName="kube-rbac-proxy-node" containerID="cri-o://f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491491 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" containerID="cri-o://9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491509 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" containerID="cri-o://9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491428 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.491428 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" containerID="cri-o://d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" gracePeriod=30 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.543227 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" containerID="cri-o://69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" gracePeriod=30 Mar 10 00:18:18 crc 
kubenswrapper[4994]: I0310 00:18:18.838307 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.842084 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-acl-logging/0.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.843659 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-controller/0.log" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.844409 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.899962 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.900158 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.901148 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.903528 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.903676 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" gracePeriod=600 Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918227 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918110 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918396 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918524 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918600 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918644 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918678 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918728 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918765 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918799 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918837 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918865 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: 
\"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918920 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919023 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919062 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919100 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919141 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919187 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919226 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919301 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") pod \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\" (UID: \"72a13a81-4c11-4529-8a3d-2dd3c73215a7\") " Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919743 4994 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.918476 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.919978 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920049 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920063 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920110 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash" (OuterVolumeSpecName: "host-slash") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920084 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.920962 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921113 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921309 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket" (OuterVolumeSpecName: "log-socket") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921610 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log" (OuterVolumeSpecName: "node-log") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921657 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921683 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.921946 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.922009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.929798 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc" (OuterVolumeSpecName: "kube-api-access-s42gc") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "kube-api-access-s42gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.933230 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.941782 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5dml"] Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942378 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942424 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942450 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942468 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942492 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942510 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942537 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942554 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942581 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942601 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942624 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942642 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942666 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kubecfg-setup" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942683 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kubecfg-setup" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942706 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942726 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942752 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942770 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942792 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942809 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942837 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942855 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.942969 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.942992 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943246 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943285 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943309 4994 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="northd" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943335 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="kube-rbac-proxy-node" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943354 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943370 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943395 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6471fd89-1c92-498d-ba15-149418259c58" containerName="oc" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943419 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovn-acl-logging" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943443 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943464 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="nbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943486 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943505 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="sbdb" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.943779 4994 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943805 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: E0310 00:18:18.943836 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.943854 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.944067 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerName="ovnkube-controller" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.947123 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:18 crc kubenswrapper[4994]: I0310 00:18:18.957588 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "72a13a81-4c11-4529-8a3d-2dd3c73215a7" (UID: "72a13a81-4c11-4529-8a3d-2dd3c73215a7"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.020834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.020956 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021033 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021212 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021297 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021349 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021425 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021486 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021530 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021577 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021623 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021739 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021812 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021902 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.021971 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022057 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022123 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022186 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022239 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022344 4994 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022386 4994 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022414 4994 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022440 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s42gc\" (UniqueName: \"kubernetes.io/projected/72a13a81-4c11-4529-8a3d-2dd3c73215a7-kube-api-access-s42gc\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022467 4994 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022494 4994 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022522 4994 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022546 4994 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022571 4994 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022595 4994 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022619 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022645 4994 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022671 4994 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022796 4994 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022849 4994 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/72a13a81-4c11-4529-8a3d-2dd3c73215a7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022909 4994 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022938 4994 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022968 4994 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.022988 4994 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72a13a81-4c11-4529-8a3d-2dd3c73215a7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124798 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124837 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124900 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124931 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124967 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124989 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-node-log\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125007 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.124978 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-systemd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-netns\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125094 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125112 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-ovn\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125137 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-kubelet\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125158 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125215 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-run-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125257 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125323 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.125420 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125472 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-slash\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125552 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.125625 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-var-lib-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125424 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-netd\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125557 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-systemd-units\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125721 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-log-socket\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125826 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.125949 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126037 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126206 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-cni-bin\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126221 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126497 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-config\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.126602 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43e9ccbc-21ed-4371-8fde-cd3728441d1e-etc-openvswitch\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.127125 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-env-overrides\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.128339 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovnkube-script-lib\") pod 
\"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.134782 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43e9ccbc-21ed-4371-8fde-cd3728441d1e-ovn-node-metrics-cert\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.155623 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/43e9ccbc-21ed-4371-8fde-cd3728441d1e-kube-api-access-5n4db\") pod \"ovnkube-node-d5dml\" (UID: \"43e9ccbc-21ed-4371-8fde-cd3728441d1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.276013 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:19 crc kubenswrapper[4994]: W0310 00:18:19.307151 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e9ccbc_21ed_4371_8fde_cd3728441d1e.slice/crio-5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885 WatchSource:0}: Error finding container 5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885: Status 404 returned error can't find the container with id 5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.391264 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"5bb57f5d24154ee36a367a988551ee8d76bdccc9de1caa9dee5df00900d09885"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.396897 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397097 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397353 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.397447 4994 scope.go:117] "RemoveContainer" 
containerID="e0758e06b050c72a5ba1ca15578add547da69884a82997478d49f051fd653d6f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.401595 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405139 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/1.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405235 4994 generic.go:334] "Generic (PLEG): container finished" podID="6dac87a5-07eb-488d-85fe-cb8848434ae5" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" exitCode=2 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.405374 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerDied","Data":"d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.406391 4994 scope.go:117] "RemoveContainer" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.406960 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.409632 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovnkube-controller/3.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.414178 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-acl-logging/0.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415051 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ns797_72a13a81-4c11-4529-8a3d-2dd3c73215a7/ovn-controller/0.log" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415784 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415865 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415948 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416008 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416069 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416126 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" exitCode=0 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416187 4994 generic.go:334] "Generic (PLEG): 
container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" exitCode=143 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416248 4994 generic.go:334] "Generic (PLEG): container finished" podID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" exitCode=143 Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416388 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.415962 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416454 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416638 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416682 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416705 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416727 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416744 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416756 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416767 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416778 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416803 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416815 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416827 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416840 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416851 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416863 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416940 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416958 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416972 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416983 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.416994 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417004 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417015 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417025 4994 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417036 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417046 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417057 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417088 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417101 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417111 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 
00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417122 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417132 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417142 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417153 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417163 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417173 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417186 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417199 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ns797" 
event={"ID":"72a13a81-4c11-4529-8a3d-2dd3c73215a7","Type":"ContainerDied","Data":"b8da989a4363394b0ce6c6c658409a1fc0f3d0c82d2ec6c0704e2ab145277cf7"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417215 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417228 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417240 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417251 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417262 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417273 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417284 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417294 4994 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417305 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.417315 4994 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.468794 4994 scope.go:117] "RemoveContainer" containerID="04275233c9bc1fa90810612d10d8f0421cfe057942537f0390e5f7f48edef106" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.486439 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.495306 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ns797"] Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.500350 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.570712 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.593549 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.613421 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.666262 4994 scope.go:117] "RemoveContainer" 
containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.685094 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.700604 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.717228 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.732758 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.747960 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769291 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.769850 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769926 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not 
find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.769961 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.770585 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.770627 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.770654 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.771168 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID 
does not exist" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.771335 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.771427 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.772097 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772139 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772168 4994 
scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.772719 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772760 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.772787 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.773080 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773124 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773154 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.773551 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773591 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.773618 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774018 4994 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774057 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774082 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774427 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774467 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container 
\"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774494 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: E0310 00:18:19.774836 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774925 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.774963 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775334 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find 
container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775387 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775783 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.775820 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776148 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776175 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776431 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776465 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776756 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.776791 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777079 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 
922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777132 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777512 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777549 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777835 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.777869 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778285 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778317 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778602 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778634 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778947 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not 
exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.778982 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779275 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779338 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779629 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.779661 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780023 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status 
\"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780065 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780380 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780415 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780725 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.780852 4994 scope.go:117] "RemoveContainer" 
containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781184 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781232 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781602 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.781656 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782073 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could 
not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782143 4994 scope.go:117] "RemoveContainer" containerID="dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782481 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6"} err="failed to get container status \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": rpc error: code = NotFound desc = could not find container \"dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6\": container with ID starting with dbd31f838cabf5a804ce9ecf976fb9b833ec5c6453c058bccdf7b41724c9eba6 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.782530 4994 scope.go:117] "RemoveContainer" containerID="69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783006 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9"} err="failed to get container status \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": rpc error: code = NotFound desc = could not find container \"69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9\": container with ID starting with 69719bc2047cd8788e9fbce7aba40f2ace8159e1ebae7abec1e7b5106ee443f9 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783061 4994 scope.go:117] "RemoveContainer" containerID="c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 
00:18:19.783483 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e"} err="failed to get container status \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": rpc error: code = NotFound desc = could not find container \"c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e\": container with ID starting with c32ffa20aee68ca67d88f119b24393e1ae0b496fbbe41013c41fa9b62065a50e not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.783521 4994 scope.go:117] "RemoveContainer" containerID="9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784134 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1"} err="failed to get container status \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": rpc error: code = NotFound desc = could not find container \"9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1\": container with ID starting with 9740b03ab066a9a81c1f723e02fc2ff8626f9583bdfed1e2501142f92332b7c1 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784180 4994 scope.go:117] "RemoveContainer" containerID="80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c"} err="failed to get container status \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": rpc error: code = NotFound desc = could not find container \"80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c\": container with ID starting with 
80f8d97650f2fa5d59c493d51ad000f423e29ae6202fc65d854d2cdea4946b4c not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.784753 4994 scope.go:117] "RemoveContainer" containerID="d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785323 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f"} err="failed to get container status \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": rpc error: code = NotFound desc = could not find container \"d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f\": container with ID starting with d9d0ff32df8d3a783f93477cd1064fc50785d2ba0ff718e05522701ce614643f not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785347 4994 scope.go:117] "RemoveContainer" containerID="922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85"} err="failed to get container status \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": rpc error: code = NotFound desc = could not find container \"922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85\": container with ID starting with 922f6475e507eed3c8430c65dac73d50bc42b142231322f7fa33c8ee12a4ac85 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.785742 4994 scope.go:117] "RemoveContainer" containerID="f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786083 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b"} err="failed to get container status \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": rpc error: code = NotFound desc = could not find container \"f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b\": container with ID starting with f01ed2602d78b020dbb77e97d19984d85ad8a731d85bb32145aafa9b1175bd6b not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786108 4994 scope.go:117] "RemoveContainer" containerID="9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786419 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700"} err="failed to get container status \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": rpc error: code = NotFound desc = could not find container \"9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700\": container with ID starting with 9a362b4e22ed59add8433c1b6add565510c60bdc09068b33ed75e15880e86700 not found: ID does not exist" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786470 4994 scope.go:117] "RemoveContainer" containerID="ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd" Mar 10 00:18:19 crc kubenswrapper[4994]: I0310 00:18:19.786783 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd"} err="failed to get container status \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": rpc error: code = NotFound desc = could not find container \"ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd\": container with ID starting with ba19ef808177113512bae9426aee705013816c3008a79e0041f0032bf18123bd not found: ID does not 
exist" Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.430487 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.436396 4994 generic.go:334] "Generic (PLEG): container finished" podID="43e9ccbc-21ed-4371-8fde-cd3728441d1e" containerID="7c9e2f1a80a6409196282c7bb495c555806137480f8f7bcf8d441fd872a4edff" exitCode=0 Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.436452 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerDied","Data":"7c9e2f1a80a6409196282c7bb495c555806137480f8f7bcf8d441fd872a4edff"} Mar 10 00:18:20 crc kubenswrapper[4994]: I0310 00:18:20.568195 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a13a81-4c11-4529-8a3d-2dd3c73215a7" path="/var/lib/kubelet/pods/72a13a81-4c11-4529-8a3d-2dd3c73215a7/volumes" Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.444763 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"8c933bc21d307a9ac6a37d4b3ea905a4d8acaa7233ebeadadaf6c9e24814262d"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445020 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"30fb89fc99ef3646d6a83219d5ce24d8db0963cdce1a452324bf0f9fbca26a8a"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445052 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" 
event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"fb1cb8d837ed458e2c85e23928d18be2cac1a157c11de004378c33273389aa39"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445062 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"3e72f53f12212178b7de7534a4ff49f0fcb7820be27a1f8a1c7e666ebb545ec8"} Mar 10 00:18:21 crc kubenswrapper[4994]: I0310 00:18:21.445073 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"da4c02cb7f63028d262af9e65b97efacf94be0c058afe29dc17595a73977193c"} Mar 10 00:18:22 crc kubenswrapper[4994]: I0310 00:18:22.458422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"be4aefd7f943184d51b01c6f8062d8bc9cc3336ab78d5c62b309198209983bcd"} Mar 10 00:18:24 crc kubenswrapper[4994]: I0310 00:18:24.478383 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"854352bac2746c64e929519cda8ed0d737bd0e49d96556d49bd53ccdb7f5c968"} Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.495669 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" event={"ID":"43e9ccbc-21ed-4371-8fde-cd3728441d1e","Type":"ContainerStarted","Data":"1286a5909c570e5cdc4cfcbb1418a23628d9a1132f146abe27510f0b12cf4b74"} Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496623 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496648 
4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.496667 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.535001 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.535593 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" podStartSLOduration=8.535574464 podStartE2EDuration="8.535574464s" podCreationTimestamp="2026-03-10 00:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:18:26.535392309 +0000 UTC m=+720.709099048" watchObservedRunningTime="2026-03-10 00:18:26.535574464 +0000 UTC m=+720.709281233" Mar 10 00:18:26 crc kubenswrapper[4994]: I0310 00:18:26.536733 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:18:33 crc kubenswrapper[4994]: I0310 00:18:33.553672 4994 scope.go:117] "RemoveContainer" containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:33 crc kubenswrapper[4994]: E0310 00:18:33.554403 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mcxcb_openshift-multus(6dac87a5-07eb-488d-85fe-cb8848434ae5)\"" pod="openshift-multus/multus-mcxcb" podUID="6dac87a5-07eb-488d-85fe-cb8848434ae5" Mar 10 00:18:45 crc kubenswrapper[4994]: I0310 00:18:45.554543 4994 scope.go:117] "RemoveContainer" 
containerID="d5d956a023e0ec491c4113a78f5143267fd1a96d627be08eb78b078d22c69f89" Mar 10 00:18:46 crc kubenswrapper[4994]: I0310 00:18:46.639373 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mcxcb_6dac87a5-07eb-488d-85fe-cb8848434ae5/kube-multus/2.log" Mar 10 00:18:46 crc kubenswrapper[4994]: I0310 00:18:46.639792 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mcxcb" event={"ID":"6dac87a5-07eb-488d-85fe-cb8848434ae5","Type":"ContainerStarted","Data":"2eeab3e0d6126ae7f064e1fd14955fd188a04b0e8b1213a0253d264716e50167"} Mar 10 00:18:49 crc kubenswrapper[4994]: I0310 00:18:49.315114 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5dml" Mar 10 00:19:14 crc kubenswrapper[4994]: I0310 00:19:14.727926 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:14 crc kubenswrapper[4994]: I0310 00:19:14.728991 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2b884" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" containerID="cri-o://025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" gracePeriod=30 Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.124173 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253319 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253483 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.253557 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") pod \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\" (UID: \"33f42081-c92d-42cd-90b5-329a5ae6c2ad\") " Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.254995 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities" (OuterVolumeSpecName: "utilities") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.263237 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd" (OuterVolumeSpecName: "kube-api-access-cqxvd") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "kube-api-access-cqxvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.306252 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33f42081-c92d-42cd-90b5-329a5ae6c2ad" (UID: "33f42081-c92d-42cd-90b5-329a5ae6c2ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355318 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355381 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f42081-c92d-42cd-90b5-329a5ae6c2ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.355407 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqxvd\" (UniqueName: \"kubernetes.io/projected/33f42081-c92d-42cd-90b5-329a5ae6c2ad-kube-api-access-cqxvd\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847070 4994 generic.go:334] "Generic (PLEG): container finished" podID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" exitCode=0 Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"} Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847180 4994 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b884" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847203 4994 scope.go:117] "RemoveContainer" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.847186 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b884" event={"ID":"33f42081-c92d-42cd-90b5-329a5ae6c2ad","Type":"ContainerDied","Data":"2092117d1a588bddb0afa93402f75947e71f54b402d21d698eb1a68abf48a5d8"} Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.875594 4994 scope.go:117] "RemoveContainer" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.892152 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.900209 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b884"] Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.910662 4994 scope.go:117] "RemoveContainer" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.942701 4994 scope.go:117] "RemoveContainer" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 00:19:15.943454 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": container with ID starting with 025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138 not found: ID does not exist" containerID="025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.943501 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138"} err="failed to get container status \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": rpc error: code = NotFound desc = could not find container \"025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138\": container with ID starting with 025e09178933b55f1c2b7f4230657c8b2a184772137a5c391eff5bca6c7f3138 not found: ID does not exist" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.943532 4994 scope.go:117] "RemoveContainer" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 00:19:15.944041 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": container with ID starting with d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a not found: ID does not exist" containerID="d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.944241 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a"} err="failed to get container status \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": rpc error: code = NotFound desc = could not find container \"d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a\": container with ID starting with d4254cf5b0facc6f6fdf4eb104d4860b2a700ccd48a5c29a108e050c7aa09f5a not found: ID does not exist" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.944401 4994 scope.go:117] "RemoveContainer" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: E0310 
00:19:15.945146 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": container with ID starting with 9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b not found: ID does not exist" containerID="9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b" Mar 10 00:19:15 crc kubenswrapper[4994]: I0310 00:19:15.945368 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b"} err="failed to get container status \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": rpc error: code = NotFound desc = could not find container \"9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b\": container with ID starting with 9e1ecfe970695f771c62a2678eb4f24dee572f03ffddf256f488af7dd8a9190b not found: ID does not exist" Mar 10 00:19:16 crc kubenswrapper[4994]: I0310 00:19:16.567155 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" path="/var/lib/kubelet/pods/33f42081-c92d-42cd-90b5-329a5ae6c2ad/volumes" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.729662 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"] Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.729973 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.729994 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.730028 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-content" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730040 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-content" Mar 10 00:19:18 crc kubenswrapper[4994]: E0310 00:19:18.730060 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-utilities" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730072 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="extract-utilities" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.730234 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f42081-c92d-42cd-90b5-329a5ae6c2ad" containerName="registry-server" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.731482 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.733947 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.740830 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"] Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.815895 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" Mar 10 00:19:18 crc 
kubenswrapper[4994]: I0310 00:19:18.816057 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.816332 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918237 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918361 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918506 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.918761 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:18 crc kubenswrapper[4994]: I0310 00:19:18.950716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.059835 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.372028 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"]
Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876478 4994 generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="639c9df1d48af8cdff7b0f3a99a12f6945e57821435616c558a6555fc233701d" exitCode=0
Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876550 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"639c9df1d48af8cdff7b0f3a99a12f6945e57821435616c558a6555fc233701d"}
Mar 10 00:19:19 crc kubenswrapper[4994]: I0310 00:19:19.876595 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerStarted","Data":"04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe"}
Mar 10 00:19:21 crc kubenswrapper[4994]: I0310 00:19:21.897505 4994 generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="2a4772e5d3aa0dd231761f31dc60bd9c3f635f27324c3b160d231ace54e31d4e" exitCode=0
Mar 10 00:19:21 crc kubenswrapper[4994]: I0310 00:19:21.897634 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"2a4772e5d3aa0dd231761f31dc60bd9c3f635f27324c3b160d231ace54e31d4e"}
Mar 10 00:19:22 crc kubenswrapper[4994]: I0310 00:19:22.907272 4994 generic.go:334] "Generic (PLEG): container finished" podID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerID="ff7f02a03f07a7a049b25103440fb388fb7f59c7728ad51beca848d3a3f413c9" exitCode=0
Mar 10 00:19:22 crc kubenswrapper[4994]: I0310 00:19:22.907341 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"ff7f02a03f07a7a049b25103440fb388fb7f59c7728ad51beca848d3a3f413c9"}
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.186359 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287193 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") "
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287272 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") "
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.287356 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") pod \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\" (UID: \"4dea22bc-f7b5-4722-b2c2-db96edfdcb96\") "
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.290992 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle" (OuterVolumeSpecName: "bundle") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.293350 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms" (OuterVolumeSpecName: "kube-api-access-wnfms") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "kube-api-access-wnfms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.305533 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util" (OuterVolumeSpecName: "util") pod "4dea22bc-f7b5-4722-b2c2-db96edfdcb96" (UID: "4dea22bc-f7b5-4722-b2c2-db96edfdcb96"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421805 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421861 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnfms\" (UniqueName: \"kubernetes.io/projected/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-kube-api-access-wnfms\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.421908 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dea22bc-f7b5-4722-b2c2-db96edfdcb96-util\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925402 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz" event={"ID":"4dea22bc-f7b5-4722-b2c2-db96edfdcb96","Type":"ContainerDied","Data":"04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe"}
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925466 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04229c26bff1a539582938656d7ebc1ccc0f230691b0261c551ec8981379d6fe"
Mar 10 00:19:24 crc kubenswrapper[4994]: I0310 00:19:24.925486 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.140481 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"]
Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.140994 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="pull"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141062 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="pull"
Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.141088 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141107 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract"
Mar 10 00:19:25 crc kubenswrapper[4994]: E0310 00:19:25.141138 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="util"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141156 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="util"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.141362 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dea22bc-f7b5-4722-b2c2-db96edfdcb96" containerName="extract"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.142768 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.153101 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.155931 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"]
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236292 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236643 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.236763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.337535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.337627 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.337690 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.338685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.338972 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.375327 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.466784 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.825381 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"]
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.937614 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerStarted","Data":"acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f"}
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.941997 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"]
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.945247 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949438 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949610 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.949665 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:25 crc kubenswrapper[4994]: I0310 00:19:25.954248 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"]
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051648 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051732 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.051802 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.052325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.052444 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.083789 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.261621 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.552320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"]
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.951022 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="2feceea9b45bb6f6540746677a00ccfd3f91ca6a64ff77d94b7eedf8e12a692a" exitCode=0
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.951149 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"2feceea9b45bb6f6540746677a00ccfd3f91ca6a64ff77d94b7eedf8e12a692a"}
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955465 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="aa8beb4d0f2667b72d954c399b6fdc2c27721125c0776bf7190a24a0fef4c3e8" exitCode=0
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955514 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"aa8beb4d0f2667b72d954c399b6fdc2c27721125c0776bf7190a24a0fef4c3e8"}
Mar 10 00:19:26 crc kubenswrapper[4994]: I0310 00:19:26.955758 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerStarted","Data":"e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8"}
Mar 10 00:19:27 crc kubenswrapper[4994]: I0310 00:19:27.966068 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerStarted","Data":"6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357"}
Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.972176 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="a5678d823f4db87f8b63e9616486bc663448482931f43875bd878700c19d1efc" exitCode=0
Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.972226 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"a5678d823f4db87f8b63e9616486bc663448482931f43875bd878700c19d1efc"}
Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.974603 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357" exitCode=0
Mar 10 00:19:28 crc kubenswrapper[4994]: I0310 00:19:28.974642 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"6f55af6ab8604a0908f97e3a3b321cd44d205ac4b0ca29e2aadfa0dde099c357"}
Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.982098 4994 generic.go:334] "Generic (PLEG): container finished" podID="c792896d-13dd-4202-a2b7-62aac3396c78" containerID="b3a3a146058c89dd59af81116cd5cc2193337b841ba1b804decdab6758ca6140" exitCode=0
Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.982172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"b3a3a146058c89dd59af81116cd5cc2193337b841ba1b804decdab6758ca6140"}
Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.984645 4994 generic.go:334] "Generic (PLEG): container finished" podID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerID="742fa152f8a71a3355fd939e60aea365e7902312bbccb07ad4c3ff171ed266b2" exitCode=0
Mar 10 00:19:29 crc kubenswrapper[4994]: I0310 00:19:29.984683 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"742fa152f8a71a3355fd939e60aea365e7902312bbccb07ad4c3ff171ed266b2"}
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.324815 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.335759 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421316 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421377 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421404 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421435 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421507 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") pod \"c792896d-13dd-4202-a2b7-62aac3396c78\" (UID: \"c792896d-13dd-4202-a2b7-62aac3396c78\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.421545 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") pod \"b4b3e4dd-b86b-4442-9067-233a79e7942e\" (UID: \"b4b3e4dd-b86b-4442-9067-233a79e7942e\") "
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.422294 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle" (OuterVolumeSpecName: "bundle") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.422602 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle" (OuterVolumeSpecName: "bundle") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.443077 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz" (OuterVolumeSpecName: "kube-api-access-knbzz") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "kube-api-access-knbzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.443166 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf" (OuterVolumeSpecName: "kube-api-access-twhrf") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "kube-api-access-twhrf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.456500 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util" (OuterVolumeSpecName: "util") pod "b4b3e4dd-b86b-4442-9067-233a79e7942e" (UID: "b4b3e4dd-b86b-4442-9067-233a79e7942e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522723 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knbzz\" (UniqueName: \"kubernetes.io/projected/c792896d-13dd-4202-a2b7-62aac3396c78-kube-api-access-knbzz\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522938 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.522995 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-util\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.523079 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b4b3e4dd-b86b-4442-9067-233a79e7942e-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.523132 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twhrf\" (UniqueName: \"kubernetes.io/projected/b4b3e4dd-b86b-4442-9067-233a79e7942e-kube-api-access-twhrf\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.644615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util" (OuterVolumeSpecName: "util") pod "c792896d-13dd-4202-a2b7-62aac3396c78" (UID: "c792896d-13dd-4202-a2b7-62aac3396c78"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.725801 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c792896d-13dd-4202-a2b7-62aac3396c78-util\") on node \"crc\" DevicePath \"\""
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995610 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk" event={"ID":"b4b3e4dd-b86b-4442-9067-233a79e7942e","Type":"ContainerDied","Data":"e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8"}
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995652 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fe920b3ed139c55793dea0f6a34bc2501658b48e5c546414dd0c79da5357c8"
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.995626 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk"
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997456 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw" event={"ID":"c792896d-13dd-4202-a2b7-62aac3396c78","Type":"ContainerDied","Data":"acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f"}
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997511 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb9dedd7f509cbae847d1e88acbbfd13d5081e980ec7de42a96bc80f913649f"
Mar 10 00:19:31 crc kubenswrapper[4994]: I0310 00:19:31.997518 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130482 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"]
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130764 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="util"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130784 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="util"
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130806 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="util"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130817 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="util"
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130837 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130849 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130896 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130909 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130927 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="pull"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130937 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="pull"
Mar 10 00:19:34 crc kubenswrapper[4994]: E0310 00:19:34.130949 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="pull"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.130960 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="pull"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.131107 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b3e4dd-b86b-4442-9067-233a79e7942e" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.131132 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c792896d-13dd-4202-a2b7-62aac3396c78" containerName="extract"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.132250 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.133931 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.145916 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"]
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162534 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162600 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"
Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.162737 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"
Mar 10 00:19:34 crc kubenswrapper[4994]:
I0310 00:19:34.264715 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.264846 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.264976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.265953 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.265961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.286614 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.454409 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.615809 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.616763 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.618784 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.619020 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.629033 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rgfvq" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.634909 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.670995 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.696020 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.741959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.742853 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.745918 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.750165 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qqfzs" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.759418 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.760097 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.768435 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771813 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771880 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: 
\"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.771919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.775100 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.805654 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtvpv\" (UniqueName: \"kubernetes.io/projected/13e52713-fbfe-43ba-ae51-b13a060d8a05-kube-api-access-vtvpv\") pod \"obo-prometheus-operator-68bc856cb9-fnj29\" (UID: \"13e52713-fbfe-43ba-ae51-b13a060d8a05\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876394 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876695 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876740 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.876777 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.881315 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.887578 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08b7eb36-ad76-4d9a-9fe9-f37febcdfdab-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw\" (UID: \"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 
00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.942269 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.943926 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.944594 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.946904 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-922cp" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.948529 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.958283 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981406 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981461 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " 
pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981493 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.981511 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.984493 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:34 crc kubenswrapper[4994]: I0310 00:19:34.988135 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22d07ce7-cdcc-4804-8127-a4f3a9d1685f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r\" (UID: \"22d07ce7-cdcc-4804-8127-a4f3a9d1685f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.017840 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerStarted","Data":"a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20"} Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.017890 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerStarted","Data":"d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf"} Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.082468 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.082532 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086226 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086301 4994 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.086962 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.098116 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.105623 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82xb\" (UniqueName: \"kubernetes.io/projected/9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e-kube-api-access-t82xb\") pod \"observability-operator-59bdc8b94-2jk2w\" (UID: \"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e\") " pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.107045 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.109183 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6wpx6" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.133192 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.183533 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.183842 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.257218 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.285244 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.285358 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.287028 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-openshift-service-ca\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.306056 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvd8\" (UniqueName: \"kubernetes.io/projected/65c6820f-4375-4de8-bcdf-0f0e2c4bcd87-kube-api-access-6cvd8\") pod \"perses-operator-5bf474d74f-gxbhj\" (UID: \"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87\") " pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.366443 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r"] Mar 10 00:19:35 crc kubenswrapper[4994]: 
W0310 00:19:35.383193 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d07ce7_cdcc_4804_8127_a4f3a9d1685f.slice/crio-2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99 WatchSource:0}: Error finding container 2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99: Status 404 returned error can't find the container with id 2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99 Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.408528 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.421333 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw"] Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.432459 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.531596 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2jk2w"] Mar 10 00:19:35 crc kubenswrapper[4994]: W0310 00:19:35.538156 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec2ef1a_309f_4d22_b9e7_c6536fb8a46e.slice/crio-ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003 WatchSource:0}: Error finding container ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003: Status 404 returned error can't find the container with id ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003 Mar 10 00:19:35 crc kubenswrapper[4994]: I0310 00:19:35.622558 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-gxbhj"] Mar 10 
00:19:35 crc kubenswrapper[4994]: W0310 00:19:35.629251 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c6820f_4375_4de8_bcdf_0f0e2c4bcd87.slice/crio-3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951 WatchSource:0}: Error finding container 3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951: Status 404 returned error can't find the container with id 3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951 Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.026027 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" event={"ID":"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e","Type":"ContainerStarted","Data":"ab41d7e04137fd8dc782ceb6a4de4dd97f27ec01ea6be5948504a7d82595a003"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.027767 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" event={"ID":"13e52713-fbfe-43ba-ae51-b13a060d8a05","Type":"ContainerStarted","Data":"988bbc205a1350baa077393b997ca568bde16cd9b7c7a40885c2a283eb21d3c4"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.029455 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" event={"ID":"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab","Type":"ContainerStarted","Data":"d1577796553acf114f7080bb7c9e207c3708bc27c9e6f9f5a3983158ee3a7833"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.030585 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" event={"ID":"22d07ce7-cdcc-4804-8127-a4f3a9d1685f","Type":"ContainerStarted","Data":"2816254c96065bbacbf1f79a20d126caa641e7b7d6c396eecc2b6928258b8f99"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.031853 4994 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" event={"ID":"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87","Type":"ContainerStarted","Data":"3d6413977120c2da73c7d14d5d9dd5f31fb6a1df482e0cb53b4ed4214d80f951"} Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.033495 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20" exitCode=0 Mar 10 00:19:36 crc kubenswrapper[4994]: I0310 00:19:36.033539 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"a88bdcb4e90901bbacfd8ee24aa0a9a374eebc2ed6f9998f79f9db1119b33e20"} Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.968839 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.970152 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.972608 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-dppkh" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.972718 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.980232 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 10 00:19:41 crc kubenswrapper[4994]: I0310 00:19:41.986307 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.069582 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod \"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.170774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod \"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.194222 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb44d\" (UniqueName: \"kubernetes.io/projected/e54a8fd6-9ed8-42fd-bf63-b45930f9c54e-kube-api-access-fb44d\") pod 
\"interconnect-operator-5bb49f789d-m4pqx\" (UID: \"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:42 crc kubenswrapper[4994]: I0310 00:19:42.287672 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.062902 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.064473 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.066338 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.066805 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-4h5bj" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.122767 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210780 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210827 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod 
\"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.210846 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312560 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312618 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.312643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.330791 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-webhook-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.331516 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzntx\" (UniqueName: \"kubernetes.io/projected/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-kube-api-access-xzntx\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.333485 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e02a24cb-8f25-4fd1-9f32-aa8ff3116662-apiservice-cert\") pod \"elastic-operator-67966b6766-wzg6h\" (UID: \"e02a24cb-8f25-4fd1-9f32-aa8ff3116662\") " pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:45 crc kubenswrapper[4994]: I0310 00:19:45.378682 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.109711 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.110423 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6cvd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-gxbhj_openshift-operators(65c6820f-4375-4de8-bcdf-0f0e2c4bcd87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.111935 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podUID="65c6820f-4375-4de8-bcdf-0f0e2c4bcd87" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.152976 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.153145 4994 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_openshift-operators(22d07ce7-cdcc-4804-8127-a4f3a9d1685f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:19:50 crc kubenswrapper[4994]: E0310 00:19:50.154329 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" podUID="22d07ce7-cdcc-4804-8127-a4f3a9d1685f" Mar 10 00:19:50 crc kubenswrapper[4994]: I0310 00:19:50.359691 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-m4pqx"] Mar 10 00:19:50 crc kubenswrapper[4994]: I0310 00:19:50.405672 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-67966b6766-wzg6h"] Mar 10 00:19:50 crc kubenswrapper[4994]: W0310 00:19:50.410006 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02a24cb_8f25_4fd1_9f32_aa8ff3116662.slice/crio-5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8 WatchSource:0}: Error finding container 5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8: Status 404 returned error can't find the container with id 5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8 Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.125596 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" event={"ID":"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e","Type":"ContainerStarted","Data":"3ad6c46f080b07e26711b4b60b1748b1ae30a55efd962a95a36c10006db6f04a"} Mar 10 00:19:51 crc kubenswrapper[4994]: 
I0310 00:19:51.128371 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" event={"ID":"e02a24cb-8f25-4fd1-9f32-aa8ff3116662","Type":"ContainerStarted","Data":"5a5f406586dda641c509124dadd094637b3c5dba85bd13d1dd4468f6f0a6b4f8"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.129818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" event={"ID":"08b7eb36-ad76-4d9a-9fe9-f37febcdfdab","Type":"ContainerStarted","Data":"26d19c6cb24f1a9a364cca37baf5c7e6bb8b73e0457d6853ca6f0fb944b29bdf"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.131318 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="2fab9f3f289f9b453d5428a5dfb3bb7f1306fe48c43ade59ef20e4832dc1272b" exitCode=0 Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.131375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"2fab9f3f289f9b453d5428a5dfb3bb7f1306fe48c43ade59ef20e4832dc1272b"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.132859 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" event={"ID":"9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e","Type":"ContainerStarted","Data":"e339bb8b7cd9ef341b3fb2ddd08fb98ba5e3ce775ddf16513422bc6dc29ec284"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.133101 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.136759 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" 
event={"ID":"13e52713-fbfe-43ba-ae51-b13a060d8a05","Type":"ContainerStarted","Data":"253c3e6dc952519cd8f613f2db84f9d830ae1d0b6975d697ace39024e527f56e"} Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.153179 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw" podStartSLOduration=2.42793204 podStartE2EDuration="17.153157882s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.446258806 +0000 UTC m=+789.619965555" lastFinishedPulling="2026-03-10 00:19:50.171484638 +0000 UTC m=+804.345191397" observedRunningTime="2026-03-10 00:19:51.145702661 +0000 UTC m=+805.319409420" watchObservedRunningTime="2026-03-10 00:19:51.153157882 +0000 UTC m=+805.326864631" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.156289 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" Mar 10 00:19:51 crc kubenswrapper[4994]: E0310 00:19:51.157864 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podUID="65c6820f-4375-4de8-bcdf-0f0e2c4bcd87" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.200542 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-2jk2w" podStartSLOduration=2.5606218050000003 podStartE2EDuration="17.200529244s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.539995125 +0000 UTC m=+789.713701874" lastFinishedPulling="2026-03-10 00:19:50.179902564 +0000 UTC m=+804.353609313" 
observedRunningTime="2026-03-10 00:19:51.198516081 +0000 UTC m=+805.372222860" watchObservedRunningTime="2026-03-10 00:19:51.200529244 +0000 UTC m=+805.374235993" Mar 10 00:19:51 crc kubenswrapper[4994]: I0310 00:19:51.272586 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fnj29" podStartSLOduration=2.515916323 podStartE2EDuration="17.27256674s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.412856018 +0000 UTC m=+789.586562767" lastFinishedPulling="2026-03-10 00:19:50.169506425 +0000 UTC m=+804.343213184" observedRunningTime="2026-03-10 00:19:51.269982441 +0000 UTC m=+805.443689190" watchObservedRunningTime="2026-03-10 00:19:51.27256674 +0000 UTC m=+805.446273489" Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.148771 4994 generic.go:334] "Generic (PLEG): container finished" podID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerID="360b49d34a22c59827c7cea4c2fc6406ccd0c8bf54363f66c38e3ae283b8a608" exitCode=0 Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.148824 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"360b49d34a22c59827c7cea4c2fc6406ccd0c8bf54363f66c38e3ae283b8a608"} Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.153321 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" event={"ID":"22d07ce7-cdcc-4804-8127-a4f3a9d1685f","Type":"ContainerStarted","Data":"30f1685ca52b28c6bfccb579464d4018c94422b4f82ed664aaadc8cd2d6056a4"} Mar 10 00:19:52 crc kubenswrapper[4994]: I0310 00:19:52.201006 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r" 
podStartSLOduration=-9223372018.653791 podStartE2EDuration="18.200984863s" podCreationTimestamp="2026-03-10 00:19:34 +0000 UTC" firstStartedPulling="2026-03-10 00:19:35.390396094 +0000 UTC m=+789.564102843" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:19:52.196915293 +0000 UTC m=+806.370622032" watchObservedRunningTime="2026-03-10 00:19:52.200984863 +0000 UTC m=+806.374691612" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.640221 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746265 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746324 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.746384 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") pod \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\" (UID: \"1b75d3a9-a107-4c28-afc2-7eb7e1357113\") " Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.747376 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle" (OuterVolumeSpecName: "bundle") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: 
"1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.754076 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn" (OuterVolumeSpecName: "kube-api-access-bd5fn") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: "1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "kube-api-access-bd5fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.757048 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util" (OuterVolumeSpecName: "util") pod "1b75d3a9-a107-4c28-afc2-7eb7e1357113" (UID: "1b75d3a9-a107-4c28-afc2-7eb7e1357113"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848227 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd5fn\" (UniqueName: \"kubernetes.io/projected/1b75d3a9-a107-4c28-afc2-7eb7e1357113-kube-api-access-bd5fn\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848491 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:54 crc kubenswrapper[4994]: I0310 00:19:54.848501 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b75d3a9-a107-4c28-afc2-7eb7e1357113-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188387 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" 
event={"ID":"1b75d3a9-a107-4c28-afc2-7eb7e1357113","Type":"ContainerDied","Data":"d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf"} Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188462 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73cec1d6e71b20e6e921ec34546a853007d278d08029837454b216700528bdf" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.188558 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.190505 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" event={"ID":"e02a24cb-8f25-4fd1-9f32-aa8ff3116662","Type":"ContainerStarted","Data":"e8d5c3c3a60b753e86a20b2e01f2d523b1afc35c5ea6965bcf7699e80ba82b79"} Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.211835 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-67966b6766-wzg6h" podStartSLOduration=5.976363419 podStartE2EDuration="10.211816791s" podCreationTimestamp="2026-03-10 00:19:45 +0000 UTC" firstStartedPulling="2026-03-10 00:19:50.412602978 +0000 UTC m=+804.586309727" lastFinishedPulling="2026-03-10 00:19:54.64805635 +0000 UTC m=+808.821763099" observedRunningTime="2026-03-10 00:19:55.208476591 +0000 UTC m=+809.382183350" watchObservedRunningTime="2026-03-10 00:19:55.211816791 +0000 UTC m=+809.385523540" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.798750 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799633 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799731 4994 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799806 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="pull" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799856 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="pull" Mar 10 00:19:55 crc kubenswrapper[4994]: E0310 00:19:55.799927 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="util" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.799976 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="util" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.800133 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b75d3a9-a107-4c28-afc2-7eb7e1357113" containerName="extract" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.800928 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803583 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803853 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.803961 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804286 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-cw6vm" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804330 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804356 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804442 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804497 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.804531 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.858696 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: 
\"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.858961 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859046 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859205 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859247 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859273 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859300 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859326 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859352 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" 
(UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859416 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859438 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859457 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859487 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.859513 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.923879 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960413 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960693 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960788 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.960895 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961002 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961134 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961229 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961302 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961365 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961504 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961569 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 
00:19:55.961617 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961705 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961921 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.962397 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" 
Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964320 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964636 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.964902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.961841 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.966471 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.970376 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.971037 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.971670 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.972250 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.974282 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:55 crc kubenswrapper[4994]: I0310 00:19:55.978611 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:56 crc kubenswrapper[4994]: I0310 00:19:56.116711 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:59 crc kubenswrapper[4994]: I0310 00:19:59.667260 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:59 crc kubenswrapper[4994]: W0310 00:19:59.669284 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e5e8d5b_ba04_461d_b7d0_98d90dd79fd7.slice/crio-296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306 WatchSource:0}: Error finding container 296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306: Status 404 returned error can't find the container with id 296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306 Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.130965 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.131897 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.134584 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.134742 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.135110 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.139653 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.214445 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.224117 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" event={"ID":"e54a8fd6-9ed8-42fd-bf63-b45930f9c54e","Type":"ContainerStarted","Data":"3c131e6c09642c6c06c93905255f6d907a55064ea977ae14f13372801517a5e4"} Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.225132 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"296aa5ba18bedbb49c263d81e2e1853355ecf06134ea220bee7480951ed28306"} Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.240131 4994 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="service-telemetry/interconnect-operator-5bb49f789d-m4pqx" podStartSLOduration=10.065170466 podStartE2EDuration="19.2401095s" podCreationTimestamp="2026-03-10 00:19:41 +0000 UTC" firstStartedPulling="2026-03-10 00:19:50.377990388 +0000 UTC m=+804.551697137" lastFinishedPulling="2026-03-10 00:19:59.552929412 +0000 UTC m=+813.726636171" observedRunningTime="2026-03-10 00:20:00.236199156 +0000 UTC m=+814.409905905" watchObservedRunningTime="2026-03-10 00:20:00.2401095 +0000 UTC m=+814.413816249" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.316198 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.343311 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"auto-csr-approver-29551700-9pnx5\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.457535 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:00 crc kubenswrapper[4994]: I0310 00:20:00.707599 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:20:01 crc kubenswrapper[4994]: I0310 00:20:01.234098 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerStarted","Data":"010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.264058 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" event={"ID":"65c6820f-4375-4de8-bcdf-0f0e2c4bcd87","Type":"ContainerStarted","Data":"93abdcc9799949b43a61376d54507d7a8adbb2ce25ff7af94f044327956627a5"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.264582 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.265631 4994 generic.go:334] "Generic (PLEG): container finished" podID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerID="7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409" exitCode=0 Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.265675 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerDied","Data":"7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409"} Mar 10 00:20:05 crc kubenswrapper[4994]: I0310 00:20:05.288239 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" podStartSLOduration=1.385109356 podStartE2EDuration="30.288222222s" podCreationTimestamp="2026-03-10 00:19:35 +0000 UTC" 
firstStartedPulling="2026-03-10 00:19:35.632182612 +0000 UTC m=+789.805889361" lastFinishedPulling="2026-03-10 00:20:04.535295478 +0000 UTC m=+818.709002227" observedRunningTime="2026-03-10 00:20:05.28402493 +0000 UTC m=+819.457731699" watchObservedRunningTime="2026-03-10 00:20:05.288222222 +0000 UTC m=+819.461928981" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.651816 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.709252 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") pod \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\" (UID: \"e07bbe9c-f27a-4256-82ba-3adc771e2ebd\") " Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.724096 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt" (OuterVolumeSpecName: "kube-api-access-gkmwt") pod "e07bbe9c-f27a-4256-82ba-3adc771e2ebd" (UID: "e07bbe9c-f27a-4256-82ba-3adc771e2ebd"). InnerVolumeSpecName "kube-api-access-gkmwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:06 crc kubenswrapper[4994]: I0310 00:20:06.811491 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkmwt\" (UniqueName: \"kubernetes.io/projected/e07bbe9c-f27a-4256-82ba-3adc771e2ebd-kube-api-access-gkmwt\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278793 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" event={"ID":"e07bbe9c-f27a-4256-82ba-3adc771e2ebd","Type":"ContainerDied","Data":"010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a"} Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278831 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010910f36065ef6295367084341785cffbef39a2a3b7c8e3d664f1822e8f079a" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.278852 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-9pnx5" Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.720199 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:20:07 crc kubenswrapper[4994]: I0310 00:20:07.724259 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-pngfv"] Mar 10 00:20:08 crc kubenswrapper[4994]: I0310 00:20:08.564568 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91ae1c5-3f03-4439-b579-b828884a1b58" path="/var/lib/kubelet/pods/e91ae1c5-3f03-4439-b579-b828884a1b58/volumes" Mar 10 00:20:09 crc kubenswrapper[4994]: I0310 00:20:09.837912 4994 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:20:11 crc kubenswrapper[4994]: I0310 00:20:11.836048 4994 scope.go:117] "RemoveContainer" 
containerID="e7fca198849ed5918e32d90439680c268c9d2e25eff70cc1e2ce92503bc67d85" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.722932 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:12 crc kubenswrapper[4994]: E0310 00:20:12.723536 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.723552 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.723701 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" containerName="oc" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.724231 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.725752 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-5rlc4" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.726358 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.726527 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.750121 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.802754 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.802983 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904173 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904277 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.904729 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" 
(UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:12 crc kubenswrapper[4994]: I0310 00:20:12.927977 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcl77\" (UniqueName: \"kubernetes.io/projected/47cdbf6c-4d30-40c5-8af9-6685cd711b7a-kube-api-access-tcl77\") pod \"cert-manager-operator-controller-manager-5586865c96-jzlcx\" (UID: \"47cdbf6c-4d30-40c5-8af9-6685cd711b7a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:13 crc kubenswrapper[4994]: I0310 00:20:13.049083 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" Mar 10 00:20:15 crc kubenswrapper[4994]: I0310 00:20:15.436071 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-gxbhj" Mar 10 00:20:20 crc kubenswrapper[4994]: I0310 00:20:20.195593 4994 scope.go:117] "RemoveContainer" containerID="ddb1ff554509065a0634194f412b0e90319b501a3735bf3cda900f518d12f147" Mar 10 00:20:20 crc kubenswrapper[4994]: I0310 00:20:20.762798 4994 scope.go:117] "RemoveContainer" containerID="ef1f80910f9e65f34790675bdb343fd6ffef0cbe9f22353df047c95ba63843ec" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.377691 4994 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.378285 4994 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c 
/mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 00:20:21 crc kubenswrapper[4994]: E0310 00:20:21.379515 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:21 crc 
kubenswrapper[4994]: I0310 00:20:21.586197 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx"] Mar 10 00:20:21 crc kubenswrapper[4994]: W0310 00:20:21.598527 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47cdbf6c_4d30_40c5_8af9_6685cd711b7a.slice/crio-930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134 WatchSource:0}: Error finding container 930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134: Status 404 returned error can't find the container with id 930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134 Mar 10 00:20:21 crc kubenswrapper[4994]: I0310 00:20:21.601314 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.374980 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" event={"ID":"47cdbf6c-4d30-40c5-8af9-6685cd711b7a","Type":"ContainerStarted","Data":"930ca6ed0337859ec704828a7f5c8d2fc8b22ef59722422e4a262eeb62fb2134"} Mar 10 00:20:22 crc kubenswrapper[4994]: E0310 00:20:22.377291 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.495643 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:20:22 crc kubenswrapper[4994]: I0310 00:20:22.528520 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 
00:20:23 crc kubenswrapper[4994]: E0310 00:20:23.382186 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:24 crc kubenswrapper[4994]: E0310 00:20:24.423046 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" Mar 10 00:20:25 crc kubenswrapper[4994]: I0310 00:20:25.398426 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" event={"ID":"47cdbf6c-4d30-40c5-8af9-6685cd711b7a","Type":"ContainerStarted","Data":"7331001565f4232abe5c69d7691e595fa5f9b278916200892cff4d382d0f652d"} Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.165139 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jzlcx" podStartSLOduration=14.329187293 podStartE2EDuration="17.165124292s" podCreationTimestamp="2026-03-10 00:20:12 +0000 UTC" firstStartedPulling="2026-03-10 00:20:21.601108994 +0000 UTC m=+835.774815743" lastFinishedPulling="2026-03-10 00:20:24.437045993 +0000 UTC m=+838.610752742" observedRunningTime="2026-03-10 00:20:25.431302984 +0000 UTC m=+839.605009743" watchObservedRunningTime="2026-03-10 00:20:29.165124292 +0000 UTC m=+843.338831041" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.168952 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.169583 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.171062 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vzkn9" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.173238 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.173365 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.180852 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.231722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.231805 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.342271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.342348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.363974 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfb4m\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-kube-api-access-kfb4m\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.371384 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef38e78a-b3a6-4de7-ba46-598693edf905-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-6qgfs\" (UID: \"ef38e78a-b3a6-4de7-ba46-598693edf905\") " pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.486387 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:29 crc kubenswrapper[4994]: I0310 00:20:29.853768 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-6qgfs"] Mar 10 00:20:29 crc kubenswrapper[4994]: W0310 00:20:29.865118 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef38e78a_b3a6_4de7_ba46_598693edf905.slice/crio-19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4 WatchSource:0}: Error finding container 19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4: Status 404 returned error can't find the container with id 19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4 Mar 10 00:20:30 crc kubenswrapper[4994]: I0310 00:20:30.431232 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" event={"ID":"ef38e78a-b3a6-4de7-ba46-598693edf905","Type":"ContainerStarted","Data":"19de56151c60af0a6346cfa30b5a994bcf4929f49c8203a3eb8905279f118ed4"} Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.933232 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.934196 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936815 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936903 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.936997 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.944367 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.956552 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.964269 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.964966 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.968406 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-n56gh" Mar 10 00:20:31 crc kubenswrapper[4994]: I0310 00:20:31.992076 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075837 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075902 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075934 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.075963 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076024 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076063 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076091 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076138 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076156 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076200 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076219 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.076275 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177300 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177360 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177392 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 
00:20:32.177435 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177461 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177477 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177506 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177528 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: 
\"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177553 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177600 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177621 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177629 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177638 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177694 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.177779 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178303 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178441 4994 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.178925 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.179133 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.179218 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.183035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.183749 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.200377 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"service-telemetry-operator-1-build\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.206447 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5kq7\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-kube-api-access-x5kq7\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.207099 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f1e9a7-bff0-4565-9cef-d8904908dbfe-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-8qd55\" (UID: \"c5f1e9a7-bff0-4565-9cef-d8904908dbfe\") " pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.258757 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.288030 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.677571 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:32 crc kubenswrapper[4994]: W0310 00:20:32.680586 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6fc7b86_bebf_4721_a8c1_88169e4ec64e.slice/crio-393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7 WatchSource:0}: Error finding container 393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7: Status 404 returned error can't find the container with id 393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7 Mar 10 00:20:32 crc kubenswrapper[4994]: I0310 00:20:32.726595 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-8qd55"] Mar 10 00:20:32 crc kubenswrapper[4994]: W0310 00:20:32.730150 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f1e9a7_bff0_4565_9cef_d8904908dbfe.slice/crio-0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a WatchSource:0}: Error finding container 0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a: Status 404 returned error can't find the container with id 0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a Mar 10 00:20:33 crc kubenswrapper[4994]: I0310 00:20:33.453370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerStarted","Data":"393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7"} Mar 10 00:20:33 crc kubenswrapper[4994]: I0310 00:20:33.455023 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" event={"ID":"c5f1e9a7-bff0-4565-9cef-d8904908dbfe","Type":"ContainerStarted","Data":"0d070f4f7280c5a4bbb9dfba3e1b7cffa5d7980f21d312f77a2c73001fc4394a"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.467899 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" event={"ID":"ef38e78a-b3a6-4de7-ba46-598693edf905","Type":"ContainerStarted","Data":"42aaa159671125a9e9bf07e6b24b5abfcafb509ce8769303c9d9b2f61858021a"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.468337 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.468979 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" event={"ID":"c5f1e9a7-bff0-4565-9cef-d8904908dbfe","Type":"ContainerStarted","Data":"25f5126b3cbd60397c83cb729a3dc4b2450515dd72b0b24108d7d39bdc9ceb26"} Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.485722 4994 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" podStartSLOduration=1.519395747 podStartE2EDuration="6.485702052s" podCreationTimestamp="2026-03-10 00:20:29 +0000 UTC" firstStartedPulling="2026-03-10 00:20:29.87408083 +0000 UTC m=+844.047787579" lastFinishedPulling="2026-03-10 00:20:34.840387145 +0000 UTC m=+849.014093884" observedRunningTime="2026-03-10 00:20:35.484947452 +0000 UTC m=+849.658654241" watchObservedRunningTime="2026-03-10 00:20:35.485702052 +0000 UTC m=+849.659408801" Mar 10 00:20:35 crc kubenswrapper[4994]: I0310 00:20:35.514612 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-8qd55" podStartSLOduration=2.426832846 podStartE2EDuration="4.514588999s" podCreationTimestamp="2026-03-10 00:20:31 +0000 UTC" firstStartedPulling="2026-03-10 00:20:32.732317036 +0000 UTC m=+846.906023785" lastFinishedPulling="2026-03-10 00:20:34.820073159 +0000 UTC m=+848.993779938" observedRunningTime="2026-03-10 00:20:35.505243448 +0000 UTC m=+849.678950197" watchObservedRunningTime="2026-03-10 00:20:35.514588999 +0000 UTC m=+849.688295748" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.489204 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-6qgfs" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.577470 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.579421 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.582522 4994 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r8kvs" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.590529 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.685261 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.685393 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.787155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.787366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: 
\"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.807803 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-bound-sa-token\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.814540 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5z9b\" (UniqueName: \"kubernetes.io/projected/943085e6-2580-48ae-9c2d-d83989c6204c-kube-api-access-d5z9b\") pod \"cert-manager-545d4d4674-jjkfq\" (UID: \"943085e6-2580-48ae-9c2d-d83989c6204c\") " pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:39 crc kubenswrapper[4994]: I0310 00:20:39.903960 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jjkfq" Mar 10 00:20:40 crc kubenswrapper[4994]: I0310 00:20:40.716461 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jjkfq"] Mar 10 00:20:40 crc kubenswrapper[4994]: W0310 00:20:40.724111 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943085e6_2580_48ae_9c2d_d83989c6204c.slice/crio-f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d WatchSource:0}: Error finding container f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d: Status 404 returned error can't find the container with id f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.510486 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.512422 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jjkfq" event={"ID":"943085e6-2580-48ae-9c2d-d83989c6204c","Type":"ContainerStarted","Data":"66d5b3e4394353a801c9f18585ccca284d9493d9da7ad89a57addd3971112512"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.512462 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jjkfq" event={"ID":"943085e6-2580-48ae-9c2d-d83989c6204c","Type":"ContainerStarted","Data":"f37acacd45227243fd294bf3fa4e8c7791f9bdef5804af28f1c92bf0d5db623d"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.514129 4994 generic.go:334] "Generic (PLEG): container finished" podID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerID="9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f" exitCode=0 Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.514160 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f"} Mar 10 00:20:41 crc kubenswrapper[4994]: I0310 00:20:41.595999 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-jjkfq" podStartSLOduration=2.595975096 podStartE2EDuration="2.595975096s" podCreationTimestamp="2026-03-10 00:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:20:41.594015343 +0000 UTC m=+855.767722142" watchObservedRunningTime="2026-03-10 00:20:41.595975096 +0000 UTC m=+855.769681865" Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 
00:20:42.000521 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.529231 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerStarted","Data":"578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b"} Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.531710 4994 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerID="58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e" exitCode=0 Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.531820 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerDied","Data":"58298e7bf03a06c8764bfd97d0b2c25a7a168fdf40a89f470e9683bafd16507e"} Mar 10 00:20:42 crc kubenswrapper[4994]: I0310 00:20:42.570028 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.869946832 podStartE2EDuration="11.570000664s" podCreationTimestamp="2026-03-10 00:20:31 +0000 UTC" firstStartedPulling="2026-03-10 00:20:32.684270544 +0000 UTC m=+846.857977293" lastFinishedPulling="2026-03-10 00:20:40.384324376 +0000 UTC m=+854.558031125" observedRunningTime="2026-03-10 00:20:42.568538174 +0000 UTC m=+856.742244983" watchObservedRunningTime="2026-03-10 00:20:42.570000664 +0000 UTC m=+856.743707453" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.538497 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" 
containerID="cri-o://578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" gracePeriod=30 Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.686750 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.687809 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693350 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693480 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.693530 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.722438 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751663 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751720 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: 
\"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751758 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751781 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751814 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751911 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751940 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.751963 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752030 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752055 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod 
\"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.752720 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853567 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853612 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853632 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853655 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" 
(UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853676 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853692 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853711 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853727 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853747 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853776 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.853807 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854009 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854026 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854466 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854583 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.854801 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: 
\"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855259 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.855360 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.867035 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.867274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:43 crc kubenswrapper[4994]: I0310 00:20:43.868635 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"service-telemetry-operator-2-build\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:44 crc kubenswrapper[4994]: I0310 00:20:44.001992 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:46 crc kubenswrapper[4994]: I0310 00:20:46.386800 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:46 crc kubenswrapper[4994]: I0310 00:20:46.559982 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.592051 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.594689 4994 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerID="54db6cc7ea3d8404e3f2f0fb896192a9783811677c1a276c79cd3f8eb405c3de" exitCode=0 Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 
00:20:48.594737 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerDied","Data":"54db6cc7ea3d8404e3f2f0fb896192a9783811677c1a276c79cd3f8eb405c3de"} Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.893254 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:20:48 crc kubenswrapper[4994]: I0310 00:20:48.894030 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.610065 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.611356 4994 generic.go:334] "Generic (PLEG): container finished" podID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerID="578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" exitCode=1 Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.611469 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b"} Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.616212 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7","Type":"ContainerStarted","Data":"d5ebb9902f4cc566f26df996f576738a2abe32f828dc16e6f469eda8c02270a9"} Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.616499 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:20:49 crc kubenswrapper[4994]: I0310 00:20:49.707124 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=14.025372242 podStartE2EDuration="54.707102824s" podCreationTimestamp="2026-03-10 00:19:55 +0000 UTC" firstStartedPulling="2026-03-10 00:19:59.671642702 +0000 UTC m=+813.845349461" lastFinishedPulling="2026-03-10 00:20:40.353373284 +0000 UTC m=+854.527080043" observedRunningTime="2026-03-10 00:20:49.694640869 +0000 UTC m=+863.868347628" watchObservedRunningTime="2026-03-10 00:20:49.707102824 +0000 UTC m=+863.880809583" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.003327 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.004193 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161080 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161120 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161150 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161212 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161260 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161276 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161310 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161345 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161385 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161413 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161436 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161458 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") pod \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\" (UID: \"a6fc7b86-bebf-4721-a8c1-88169e4ec64e\") " Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161785 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161849 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161847 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.161883 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162204 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162338 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162615 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.162815 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.166756 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.167037 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq" (OuterVolumeSpecName: "kube-api-access-t58hq") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "kube-api-access-t58hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.167348 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "a6fc7b86-bebf-4721-a8c1-88169e4ec64e" (UID: "a6fc7b86-bebf-4721-a8c1-88169e4ec64e"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263145 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263187 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263204 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263214 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263229 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263238 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263250 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263262 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263270 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.263278 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t58hq\" (UniqueName: \"kubernetes.io/projected/a6fc7b86-bebf-4721-a8c1-88169e4ec64e-kube-api-access-t58hq\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.622396 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_a6fc7b86-bebf-4721-a8c1-88169e4ec64e/docker-build/0.log" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623127 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623127 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"a6fc7b86-bebf-4721-a8c1-88169e4ec64e","Type":"ContainerDied","Data":"393421dea70664559a45ce4c485c1e4b38594dd2141443b113ceecaeb78599d7"} Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.623190 4994 scope.go:117] "RemoveContainer" containerID="578327f2ef85774f35fe8c4237ddd2aa89ef148dfeb38a27116f7a82f23c231b" Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.653132 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.656577 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:50 crc kubenswrapper[4994]: I0310 00:20:50.662007 4994 scope.go:117] "RemoveContainer" containerID="9cca42badce94fad376287336f15b62eab0ecbaa17338ad7903e0ad928102f3f" Mar 10 00:20:52 crc kubenswrapper[4994]: I0310 00:20:52.568614 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" path="/var/lib/kubelet/pods/a6fc7b86-bebf-4721-a8c1-88169e4ec64e/volumes" Mar 10 00:20:59 crc kubenswrapper[4994]: I0310 00:20:59.710272 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a" exitCode=0 Mar 10 00:20:59 crc kubenswrapper[4994]: I0310 00:20:59.710909 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"5f529cfcb855ffbe234aadb07f23381b9405d00ba132dfa647c6f38b1781ad1a"} Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.722263 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="6c574b09f77fce2d6065cbbc628a9fe5b857fe1528785e7e247d3386857f7208" exitCode=0 Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.722333 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"6c574b09f77fce2d6065cbbc628a9fe5b857fe1528785e7e247d3386857f7208"} Mar 10 00:21:00 crc kubenswrapper[4994]: I0310 00:21:00.793189 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_46c9545f-e40a-413b-834d-b428cedc906b/manage-dockerfile/0.log" Mar 10 00:21:01 crc kubenswrapper[4994]: I0310 00:21:01.252550 4994 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7" containerName="elasticsearch" probeResult="failure" output=< Mar 10 00:21:01 crc kubenswrapper[4994]: {"timestamp": "2026-03-10T00:21:01+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 10 00:21:01 crc kubenswrapper[4994]: > Mar 10 00:21:01 crc kubenswrapper[4994]: I0310 00:21:01.732926 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerStarted","Data":"7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a"} Mar 10 00:21:06 crc kubenswrapper[4994]: I0310 00:21:06.576585 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:21:06 crc kubenswrapper[4994]: I0310 00:21:06.615764 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=23.615741845 podStartE2EDuration="23.615741845s" podCreationTimestamp="2026-03-10 00:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:21:01.781036349 +0000 UTC m=+875.954743098" watchObservedRunningTime="2026-03-10 00:21:06.615741845 +0000 UTC m=+880.789448594" Mar 10 00:21:18 crc kubenswrapper[4994]: I0310 00:21:18.892751 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:21:18 crc kubenswrapper[4994]: I0310 00:21:18.893258 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892132 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892735 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.892795 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.893645 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:21:48 crc kubenswrapper[4994]: I0310 00:21:48.893730 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" gracePeriod=600 Mar 10 00:21:49 crc kubenswrapper[4994]: E0310 00:21:49.001182 4994 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced5d66d_39df_4267_b801_e1e60d517ace.slice/crio-226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db.scope\": RecentStats: unable to find data in memory cache]" Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085332 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" exitCode=0 Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085394 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db"} Mar 10 00:21:49 crc kubenswrapper[4994]: I0310 00:21:49.085648 4994 scope.go:117] "RemoveContainer" containerID="3a58b30808ba3fd3b4a9259ace5f595c7d1bd5910098d24eb4a7ede149499cfa" Mar 10 00:21:50 crc kubenswrapper[4994]: I0310 00:21:50.097697 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.151275 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:00 crc kubenswrapper[4994]: E0310 00:22:00.152259 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152280 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4994]: E0310 00:22:00.152311 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152323 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.152507 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fc7b86-bebf-4721-a8c1-88169e4ec64e" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.153144 4994 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.157058 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.159935 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.160023 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.162223 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.179711 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.280446 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.302077 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"auto-csr-approver-29551702-qz8hb\" (UID: 
\"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.470328 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:00 crc kubenswrapper[4994]: I0310 00:22:00.685389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:22:01 crc kubenswrapper[4994]: I0310 00:22:01.187038 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerStarted","Data":"53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10"} Mar 10 00:22:03 crc kubenswrapper[4994]: I0310 00:22:03.200173 4994 generic.go:334] "Generic (PLEG): container finished" podID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerID="579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910" exitCode=0 Mar 10 00:22:03 crc kubenswrapper[4994]: I0310 00:22:03.200317 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerDied","Data":"579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910"} Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.521859 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.539806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") pod \"0dbd271f-5f29-4221-bfbe-2274ce440c29\" (UID: \"0dbd271f-5f29-4221-bfbe-2274ce440c29\") " Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.546736 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd" (OuterVolumeSpecName: "kube-api-access-kd2zd") pod "0dbd271f-5f29-4221-bfbe-2274ce440c29" (UID: "0dbd271f-5f29-4221-bfbe-2274ce440c29"). InnerVolumeSpecName "kube-api-access-kd2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:04 crc kubenswrapper[4994]: I0310 00:22:04.641710 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2zd\" (UniqueName: \"kubernetes.io/projected/0dbd271f-5f29-4221-bfbe-2274ce440c29-kube-api-access-kd2zd\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218633 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" event={"ID":"0dbd271f-5f29-4221-bfbe-2274ce440c29","Type":"ContainerDied","Data":"53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10"} Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218690 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53856e54815dff5bb08b5f00e560176a1c7b6df0ea506a96af56d1a1e3dd8b10" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.218696 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-qz8hb" Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.604474 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:22:05 crc kubenswrapper[4994]: I0310 00:22:05.612164 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-w9ztg"] Mar 10 00:22:06 crc kubenswrapper[4994]: I0310 00:22:06.564105 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25bd204-3572-4880-b74f-764a5a3e0123" path="/var/lib/kubelet/pods/f25bd204-3572-4880-b74f-764a5a3e0123/volumes" Mar 10 00:22:21 crc kubenswrapper[4994]: I0310 00:22:21.374467 4994 scope.go:117] "RemoveContainer" containerID="60b85c7f8cd24fb6dc7f7bf060fdb2ddff2c7fdefc6188b8ccedeba460b2b511" Mar 10 00:22:29 crc kubenswrapper[4994]: I0310 00:22:29.404469 4994 generic.go:334] "Generic (PLEG): container finished" podID="46c9545f-e40a-413b-834d-b428cedc906b" containerID="7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a" exitCode=0 Mar 10 00:22:29 crc kubenswrapper[4994]: I0310 00:22:29.405225 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"7902da0dd9911d9b653dd31e0ee68587f1799979b11fd8b932f59d984685714a"} Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.763187 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930667 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.930914 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931000 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931071 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931116 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931173 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931364 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931400 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931486 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931568 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.931994 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.932039 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.932428 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933038 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") pod \"46c9545f-e40a-413b-834d-b428cedc906b\" (UID: \"46c9545f-e40a-413b-834d-b428cedc906b\") " Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933407 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933434 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933453 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933471 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/46c9545f-e40a-413b-834d-b428cedc906b-buildcachedir\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.933488 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46c9545f-e40a-413b-834d-b428cedc906b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.934650 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.944576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.950051 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98" (OuterVolumeSpecName: "kube-api-access-pdv98") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "kube-api-access-pdv98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.951061 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:30 crc kubenswrapper[4994]: I0310 00:22:30.993366 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034324 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034608 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034690 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034773 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/46c9545f-e40a-413b-834d-b428cedc906b-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.034846 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdv98\" (UniqueName: \"kubernetes.io/projected/46c9545f-e40a-413b-834d-b428cedc906b-kube-api-access-pdv98\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.129628 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.137004 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424678 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"46c9545f-e40a-413b-834d-b428cedc906b","Type":"ContainerDied","Data":"d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f"} Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424743 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d087a7e67b101576e99855a78361be7581843febff904371846d8544b91e846f" Mar 10 00:22:31 crc kubenswrapper[4994]: I0310 00:22:31.424861 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:22:33 crc kubenswrapper[4994]: I0310 00:22:33.669335 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "46c9545f-e40a-413b-834d-b428cedc906b" (UID: "46c9545f-e40a-413b-834d-b428cedc906b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:33 crc kubenswrapper[4994]: I0310 00:22:33.677511 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/46c9545f-e40a-413b-834d-b428cedc906b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142217 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142554 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142575 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc" Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142597 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="manage-dockerfile" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142609 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="manage-dockerfile" Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142635 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="git-clone" Mar 10 00:22:35 crc 
kubenswrapper[4994]: I0310 00:22:35.142649 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="git-clone" Mar 10 00:22:35 crc kubenswrapper[4994]: E0310 00:22:35.142679 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142691 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142898 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" containerName="oc" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.142919 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c9545f-e40a-413b-834d-b428cedc906b" containerName="docker-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.144150 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.146800 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.146803 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.147476 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.148641 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.166605 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.199991 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200109 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200156 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200191 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200220 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200270 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200328 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200367 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200432 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200477 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.200523 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: 
\"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301747 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301849 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301916 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301953 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.301985 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302037 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302093 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302131 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 
00:22:35.302196 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302242 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302462 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302852 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.302961 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: 
I0310 00:22:35.303047 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303304 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303461 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.303751 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 
10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304255 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.304721 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.310701 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.313211 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.333718 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"smart-gateway-operator-1-build\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.469459 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:35 crc kubenswrapper[4994]: I0310 00:22:35.681524 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.470650 4994 generic.go:334] "Generic (PLEG): container finished" podID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerID="eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681" exitCode=0 Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.470749 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681"} Mar 10 00:22:36 crc kubenswrapper[4994]: I0310 00:22:36.471071 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerStarted","Data":"2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e"} Mar 10 00:22:37 crc kubenswrapper[4994]: I0310 00:22:37.482268 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerStarted","Data":"5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632"} Mar 10 00:22:37 crc kubenswrapper[4994]: I0310 00:22:37.520320 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.520295173 podStartE2EDuration="2.520295173s" podCreationTimestamp="2026-03-10 00:22:35 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:37.511956998 +0000 UTC m=+971.685663767" watchObservedRunningTime="2026-03-10 00:22:37.520295173 +0000 UTC m=+971.694001962" Mar 10 00:22:46 crc kubenswrapper[4994]: I0310 00:22:46.061542 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:46 crc kubenswrapper[4994]: I0310 00:22:46.062483 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" containerID="cri-o://5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" gracePeriod=30 Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.560709 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_10f5518f-f5ed-47fb-b443-fae9128aec81/docker-build/0.log" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561505 4994 generic.go:334] "Generic (PLEG): container finished" podID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerID="5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" exitCode=1 Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632"} Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561637 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"10f5518f-f5ed-47fb-b443-fae9128aec81","Type":"ContainerDied","Data":"2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e"} Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.561660 4994 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6c5f55e005a4b106f0f86645e6873333919851ee6a45e984f93fbe7bf04a2e" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.608719 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_10f5518f-f5ed-47fb-b443-fae9128aec81/docker-build/0.log" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.609363 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718340 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718416 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718481 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: 
\"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718579 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718606 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718645 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718822 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719286 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719328 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.719928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721050 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.718680 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721176 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.721241 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") pod \"10f5518f-f5ed-47fb-b443-fae9128aec81\" (UID: \"10f5518f-f5ed-47fb-b443-fae9128aec81\") " Mar 10 00:22:47 crc 
kubenswrapper[4994]: I0310 00:22:47.722370 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722810 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722829 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722842 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722853 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/10f5518f-f5ed-47fb-b443-fae9128aec81-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722864 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722898 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.722913 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/10f5518f-f5ed-47fb-b443-fae9128aec81-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723342 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:47 crc kubenswrapper[4994]: E0310 00:22:47.723738 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723773 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: E0310 00:22:47.723796 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="manage-dockerfile" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723805 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="manage-dockerfile" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.723950 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" containerName="docker-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.724773 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.726468 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.727007 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736507 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736767 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.736949 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.742163 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.746097 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb" (OuterVolumeSpecName: "kube-api-access-m4ttb") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "kube-api-access-m4ttb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824617 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824694 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824793 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824838 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824899 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.824962 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825001 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825072 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: 
I0310 00:22:47.825122 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825170 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825337 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825502 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825528 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4ttb\" (UniqueName: \"kubernetes.io/projected/10f5518f-f5ed-47fb-b443-fae9128aec81-kube-api-access-m4ttb\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.825543 4994 reconciler_common.go:293] "Volume detached for volume 
\"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/10f5518f-f5ed-47fb-b443-fae9128aec81-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.916905 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926253 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926345 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926403 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926482 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926535 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926573 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927773 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927839 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927163 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.927839 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928105 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.926662 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928299 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc 
kubenswrapper[4994]: I0310 00:22:47.927068 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928719 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928750 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.928854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929033 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929103 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929226 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929319 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.929945 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.948176 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.948186 4994 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:47 crc kubenswrapper[4994]: I0310 00:22:47.952965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"smart-gateway-operator-2-build\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.087433 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.158114 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "10f5518f-f5ed-47fb-b443-fae9128aec81" (UID: "10f5518f-f5ed-47fb-b443-fae9128aec81"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.234463 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/10f5518f-f5ed-47fb-b443-fae9128aec81-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.323998 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.572087 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874"} Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.572120 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.632955 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:48 crc kubenswrapper[4994]: I0310 00:22:48.642949 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:49 crc kubenswrapper[4994]: I0310 00:22:49.583173 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed"} Mar 10 00:22:50 crc kubenswrapper[4994]: I0310 00:22:50.567171 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f5518f-f5ed-47fb-b443-fae9128aec81" path="/var/lib/kubelet/pods/10f5518f-f5ed-47fb-b443-fae9128aec81/volumes" Mar 10 00:22:50 crc 
kubenswrapper[4994]: I0310 00:22:50.594185 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed" exitCode=0 Mar 10 00:22:50 crc kubenswrapper[4994]: I0310 00:22:50.594246 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"bc29d20723fae562ff49599b98ec8b08c65f2481204182fd5331b443e783e6ed"} Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.607237 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="29ab98ad95069b94cd163c490a7f86c25c7ed6375e4010a92852a665528555c8" exitCode=0 Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.607310 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"29ab98ad95069b94cd163c490a7f86c25c7ed6375e4010a92852a665528555c8"} Mar 10 00:22:51 crc kubenswrapper[4994]: I0310 00:22:51.649571 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_998fc9e9-e892-4f8f-9c76-6a6129ab98e7/manage-dockerfile/0.log" Mar 10 00:22:52 crc kubenswrapper[4994]: I0310 00:22:52.618001 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerStarted","Data":"54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6"} Mar 10 00:22:52 crc kubenswrapper[4994]: I0310 00:22:52.659553 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.659529983 podStartE2EDuration="5.659529983s" podCreationTimestamp="2026-03-10 00:22:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:52.6505043 +0000 UTC m=+986.824211079" watchObservedRunningTime="2026-03-10 00:22:52.659529983 +0000 UTC m=+986.833236762" Mar 10 00:23:56 crc kubenswrapper[4994]: I0310 00:23:56.062847 4994 generic.go:334] "Generic (PLEG): container finished" podID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerID="54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6" exitCode=0 Mar 10 00:23:56 crc kubenswrapper[4994]: I0310 00:23:56.062986 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"54bdbba3bbbc238dc5b0ebf2dbf751d59ee9f9abc3b40893b0cd282f9a3228f6"} Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.381849 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494159 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494323 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494548 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494714 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494780 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494839 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494948 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.494986 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495048 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495091 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495157 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495183 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495245 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495331 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") pod \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\" (UID: \"998fc9e9-e892-4f8f-9c76-6a6129ab98e7\") " Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495730 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495798 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495968 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495984 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.495994 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496002 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496066 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.496336 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.500921 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509505 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh" (OuterVolumeSpecName: "kube-api-access-thrgh") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "kube-api-access-thrgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.509540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.598944 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrgh\" (UniqueName: \"kubernetes.io/projected/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-kube-api-access-thrgh\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599024 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599038 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599051 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599062 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.599076 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.746433 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache" (OuterVolumeSpecName: 
"build-blob-cache") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:57 crc kubenswrapper[4994]: I0310 00:23:57.803704 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080536 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"998fc9e9-e892-4f8f-9c76-6a6129ab98e7","Type":"ContainerDied","Data":"c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874"} Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080584 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8f48d2ddd1148a083fa4a3e0e28f96eeb589b9ec52ced72fd24a2fc874d5874" Mar 10 00:23:58 crc kubenswrapper[4994]: I0310 00:23:58.080673 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:59 crc kubenswrapper[4994]: I0310 00:23:59.537898 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "998fc9e9-e892-4f8f-9c76-6a6129ab98e7" (UID: "998fc9e9-e892-4f8f-9c76-6a6129ab98e7"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:59 crc kubenswrapper[4994]: I0310 00:23:59.630755 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/998fc9e9-e892-4f8f-9c76-6a6129ab98e7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.156351 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157365 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157395 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157462 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="git-clone" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157479 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="git-clone" Mar 10 00:24:00 crc kubenswrapper[4994]: E0310 00:24:00.157524 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="manage-dockerfile" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.157543 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="manage-dockerfile" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.158097 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="998fc9e9-e892-4f8f-9c76-6a6129ab98e7" containerName="docker-build" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.159549 4994 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.164894 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.171065 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.171065 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.174169 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.339958 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.443520 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.473474 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"auto-csr-approver-29551704-8n96k\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " 
pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.492206 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:00 crc kubenswrapper[4994]: I0310 00:24:00.915558 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.106600 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerStarted","Data":"f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807"} Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.990006 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.991714 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.993537 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.994248 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.994745 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:24:01 crc kubenswrapper[4994]: I0310 00:24:01.997063 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.017123 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175542 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175654 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175701 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") 
pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.175851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176001 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176203 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176296 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176366 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod 
\"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176487 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176602 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176835 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.176931 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287680 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287791 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287852 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.287961 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288022 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288100 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") 
pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288145 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288210 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288256 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288306 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc 
kubenswrapper[4994]: I0310 00:24:02.290002 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.288318 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290133 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290174 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290170 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290261 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.290430 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:02 crc kubenswrapper[4994]: I0310 00:24:02.291658 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.293812 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.294758 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: 
\"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.294792 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.306664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"sg-core-1-build\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.410806 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:04 crc kubenswrapper[4994]: I0310 00:24:04.632609 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:04 crc kubenswrapper[4994]: W0310 00:24:04.642157 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71b7ce5_faba_47c4_b13e_5d333f06efc3.slice/crio-4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555 WatchSource:0}: Error finding container 4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555: Status 404 returned error can't find the container with id 4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.142413 4994 generic.go:334] "Generic (PLEG): container finished" podID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" exitCode=0 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.142492 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36"} Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.143076 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerStarted","Data":"4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555"} Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.148835 4994 generic.go:334] "Generic (PLEG): container finished" podID="ca1779b4-8945-4667-b086-b7481edf1099" containerID="05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201" exitCode=0 Mar 10 00:24:05 crc kubenswrapper[4994]: I0310 00:24:05.149103 4994 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerDied","Data":"05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201"} Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.163555 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerStarted","Data":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.197541 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=5.19751942 podStartE2EDuration="5.19751942s" podCreationTimestamp="2026-03-10 00:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:24:06.192634898 +0000 UTC m=+1060.366341667" watchObservedRunningTime="2026-03-10 00:24:06.19751942 +0000 UTC m=+1060.371226169" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.427575 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.456002 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") pod \"ca1779b4-8945-4667-b086-b7481edf1099\" (UID: \"ca1779b4-8945-4667-b086-b7481edf1099\") " Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.464088 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt" (OuterVolumeSpecName: "kube-api-access-754pt") pod "ca1779b4-8945-4667-b086-b7481edf1099" (UID: "ca1779b4-8945-4667-b086-b7481edf1099"). InnerVolumeSpecName "kube-api-access-754pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:24:06 crc kubenswrapper[4994]: I0310 00:24:06.557149 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-754pt\" (UniqueName: \"kubernetes.io/projected/ca1779b4-8945-4667-b086-b7481edf1099-kube-api-access-754pt\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174331 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-8n96k" event={"ID":"ca1779b4-8945-4667-b086-b7481edf1099","Type":"ContainerDied","Data":"f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807"} Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174384 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-8n96k" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.174413 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27aa240ed8c53a1f70a3cb47c55285d93f9439410091b3f051238f53f054807" Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.502917 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:24:07 crc kubenswrapper[4994]: I0310 00:24:07.512360 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-2d6g5"] Mar 10 00:24:08 crc kubenswrapper[4994]: I0310 00:24:08.570300 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6471fd89-1c92-498d-ba15-149418259c58" path="/var/lib/kubelet/pods/6471fd89-1c92-498d-ba15-149418259c58/volumes" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.331984 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.332474 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" containerID="cri-o://5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" gracePeriod=30 Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.747775 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f71b7ce5-faba-47c4-b13e-5d333f06efc3/docker-build/0.log" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.748677 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846463 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846508 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846540 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846569 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846586 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.846602 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847210 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847337 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847534 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.847614 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.852158 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.852587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948008 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948087 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948163 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glhr2\" (UniqueName: 
\"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948191 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948211 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") pod \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\" (UID: \"f71b7ce5-faba-47c4-b13e-5d333f06efc3\") " Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948827 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948869 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948892 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948952 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948939 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.948971 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949046 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/f71b7ce5-faba-47c4-b13e-5d333f06efc3-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949408 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.949863 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:12 crc kubenswrapper[4994]: I0310 00:24:12.957470 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2" (OuterVolumeSpecName: "kube-api-access-glhr2") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "kube-api-access-glhr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.038198 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049846 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f71b7ce5-faba-47c4-b13e-5d333f06efc3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049872 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049881 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049891 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f71b7ce5-faba-47c4-b13e-5d333f06efc3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.049899 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glhr2\" (UniqueName: \"kubernetes.io/projected/f71b7ce5-faba-47c4-b13e-5d333f06efc3-kube-api-access-glhr2\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.088595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f71b7ce5-faba-47c4-b13e-5d333f06efc3" (UID: "f71b7ce5-faba-47c4-b13e-5d333f06efc3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.151142 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f71b7ce5-faba-47c4-b13e-5d333f06efc3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.222490 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_f71b7ce5-faba-47c4-b13e-5d333f06efc3/docker-build/0.log" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.222996 4994 generic.go:334] "Generic (PLEG): container finished" podID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" exitCode=1 Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223085 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"f71b7ce5-faba-47c4-b13e-5d333f06efc3","Type":"ContainerDied","Data":"4cfefb7397487a02866f52632158f74eea10ce28127274c9486d96660b536555"} Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223109 4994 scope.go:117] "RemoveContainer" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.223200 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.271736 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.273525 4994 scope.go:117] "RemoveContainer" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.278875 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303124 4994 scope.go:117] "RemoveContainer" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: E0310 00:24:13.303739 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": container with ID starting with 5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16 not found: ID does not exist" containerID="5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303787 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16"} err="failed to get container status \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": rpc error: code = NotFound desc = could not find container \"5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16\": container with ID starting with 5f86a353bc57c392e4757718737202a2ae0ad6212088fe58cf7e9aaa9a859f16 not found: ID does not exist" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.303818 4994 scope.go:117] "RemoveContainer" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc 
kubenswrapper[4994]: E0310 00:24:13.304179 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": container with ID starting with 3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36 not found: ID does not exist" containerID="3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36" Mar 10 00:24:13 crc kubenswrapper[4994]: I0310 00:24:13.304200 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36"} err="failed to get container status \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": rpc error: code = NotFound desc = could not find container \"3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36\": container with ID starting with 3284a243f9ce93bf265d6ffa02a4f6b5ffa3ec59555f3bdc1991f7298bf2ec36 not found: ID does not exist" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027092 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027490 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027517 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027539 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="manage-dockerfile" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027553 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="manage-dockerfile" Mar 10 
00:24:14 crc kubenswrapper[4994]: E0310 00:24:14.027570 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027582 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027794 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" containerName="docker-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.027836 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1779b4-8945-4667-b086-b7481edf1099" containerName="oc" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.029454 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.032138 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.032859 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.033274 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.033535 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.045010 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069365 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069424 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069450 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069530 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069635 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069746 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.069944 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070050 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070106 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070221 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070263 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.070294 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172070 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172327 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172534 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172483 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod 
\"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172605 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172643 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172673 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172727 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172759 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: 
\"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172832 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172859 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172890 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.172942 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 
crc kubenswrapper[4994]: I0310 00:24:14.173274 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.173651 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.173736 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174104 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174152 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174268 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.174369 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.176716 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.176816 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.194510 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"sg-core-2-build\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.359944 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.564663 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71b7ce5-faba-47c4-b13e-5d333f06efc3" path="/var/lib/kubelet/pods/f71b7ce5-faba-47c4-b13e-5d333f06efc3/volumes" Mar 10 00:24:14 crc kubenswrapper[4994]: I0310 00:24:14.668476 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:24:15 crc kubenswrapper[4994]: I0310 00:24:15.241008 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66"} Mar 10 00:24:15 crc kubenswrapper[4994]: I0310 00:24:15.241362 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe"} Mar 10 00:24:16 crc kubenswrapper[4994]: I0310 00:24:16.250682 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66" exitCode=0 Mar 10 00:24:16 crc kubenswrapper[4994]: I0310 00:24:16.250881 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"b38fd394341d0bca8c978c6208bc322f1303868e9303842247913a0129a0ba66"} Mar 10 00:24:18 crc kubenswrapper[4994]: I0310 00:24:18.893074 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 10 00:24:18 crc kubenswrapper[4994]: I0310 00:24:18.893900 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:24:21 crc kubenswrapper[4994]: I0310 00:24:21.475015 4994 scope.go:117] "RemoveContainer" containerID="b02eba80da92e156810a460cd2fd2b2fbae8ce74141ff71d34b4fc6b8bc7db3f" Mar 10 00:24:23 crc kubenswrapper[4994]: I0310 00:24:23.317818 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c"} Mar 10 00:24:24 crc kubenswrapper[4994]: I0310 00:24:24.328182 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c" exitCode=0 Mar 10 00:24:24 crc kubenswrapper[4994]: I0310 00:24:24.328240 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"b7709b48ec49d7397d6227aa1d44c39adb7449f43816e0b86669024bb3c14d3c"} Mar 10 00:24:25 crc kubenswrapper[4994]: I0310 00:24:25.337049 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerStarted","Data":"6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0"} Mar 10 00:24:25 crc kubenswrapper[4994]: I0310 00:24:25.366629 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=12.366609381 
podStartE2EDuration="12.366609381s" podCreationTimestamp="2026-03-10 00:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:24:25.364391881 +0000 UTC m=+1079.538098630" watchObservedRunningTime="2026-03-10 00:24:25.366609381 +0000 UTC m=+1079.540316130" Mar 10 00:24:48 crc kubenswrapper[4994]: I0310 00:24:48.892287 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:24:48 crc kubenswrapper[4994]: I0310 00:24:48.892953 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.102607 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.104286 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.122432 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226383 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226440 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.226498 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327591 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.327744 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.328187 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.328257 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.345815 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"redhat-operators-xmg88\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.423865 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:05 crc kubenswrapper[4994]: I0310 00:25:05.647789 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625057 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" exitCode=0 Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625147 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259"} Mar 10 00:25:06 crc kubenswrapper[4994]: I0310 00:25:06.625374 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"35da12795b727384764d62c7284523d0731f85372bc44799ea418bef94b3a2c0"} Mar 10 00:25:09 crc kubenswrapper[4994]: I0310 00:25:09.644417 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} Mar 10 00:25:11 crc kubenswrapper[4994]: I0310 00:25:11.661295 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" exitCode=0 Mar 10 00:25:11 crc kubenswrapper[4994]: I0310 00:25:11.661370 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" 
event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} Mar 10 00:25:12 crc kubenswrapper[4994]: I0310 00:25:12.672659 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerStarted","Data":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} Mar 10 00:25:12 crc kubenswrapper[4994]: I0310 00:25:12.701261 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmg88" podStartSLOduration=2.167660839 podStartE2EDuration="7.701227965s" podCreationTimestamp="2026-03-10 00:25:05 +0000 UTC" firstStartedPulling="2026-03-10 00:25:06.626711989 +0000 UTC m=+1120.800418738" lastFinishedPulling="2026-03-10 00:25:12.160279075 +0000 UTC m=+1126.333985864" observedRunningTime="2026-03-10 00:25:12.692547512 +0000 UTC m=+1126.866254351" watchObservedRunningTime="2026-03-10 00:25:12.701227965 +0000 UTC m=+1126.874934754" Mar 10 00:25:15 crc kubenswrapper[4994]: I0310 00:25:15.424350 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:15 crc kubenswrapper[4994]: I0310 00:25:15.424433 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:16 crc kubenswrapper[4994]: I0310 00:25:16.462526 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmg88" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" probeResult="failure" output=< Mar 10 00:25:16 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:25:16 crc kubenswrapper[4994]: > Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.892984 4994 patch_prober.go:28] interesting 
pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.894136 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.894251 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.895262 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:25:18 crc kubenswrapper[4994]: I0310 00:25:18.895369 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" gracePeriod=600 Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.734708 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" 
event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e"} Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.735184 4994 scope.go:117] "RemoveContainer" containerID="226ba5dd02665930d82054325a4e53e30bff51d1812a48cc472d4cc84e6237db" Mar 10 00:25:19 crc kubenswrapper[4994]: I0310 00:25:19.736227 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" exitCode=0 Mar 10 00:25:20 crc kubenswrapper[4994]: I0310 00:25:20.748456 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.219336 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.221456 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.226107 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318015 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.318476 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420076 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420164 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420185 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.420763 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.450691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"community-operators-wb9zw\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.467807 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.505338 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.550238 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:25 crc kubenswrapper[4994]: I0310 00:25:25.828048 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804190 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" exitCode=0 Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804283 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4"} Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.804640 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"a6eab4ce936f87d49ced6dfd2bb1e60e9d0032bca1642da7e1ee18e34523c380"} Mar 10 00:25:26 crc kubenswrapper[4994]: I0310 00:25:26.806597 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.796942 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.797532 4994 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xmg88" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" containerID="cri-o://e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" gracePeriod=2 Mar 10 00:25:27 crc kubenswrapper[4994]: I0310 00:25:27.810375 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.191887 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275650 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275719 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.275791 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") pod \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\" (UID: \"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a\") " Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.276705 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities" (OuterVolumeSpecName: "utilities") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.288511 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx" (OuterVolumeSpecName: "kube-api-access-p55nx") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "kube-api-access-p55nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.377946 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p55nx\" (UniqueName: \"kubernetes.io/projected/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-kube-api-access-p55nx\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.377992 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.444516 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" (UID: "6cedd32c-f8fb-4e43-b82e-57c0ae8d384a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.479828 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.817237 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" exitCode=0 Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.817300 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820768 4994 generic.go:334] "Generic (PLEG): container finished" podID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" exitCode=0 Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820792 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820811 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmg88" event={"ID":"6cedd32c-f8fb-4e43-b82e-57c0ae8d384a","Type":"ContainerDied","Data":"35da12795b727384764d62c7284523d0731f85372bc44799ea418bef94b3a2c0"} Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820813 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmg88" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.820861 4994 scope.go:117] "RemoveContainer" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.837161 4994 scope.go:117] "RemoveContainer" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.842263 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.848850 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmg88"] Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.865772 4994 scope.go:117] "RemoveContainer" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.898991 4994 scope.go:117] "RemoveContainer" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.899564 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": container with ID starting with e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313 not found: ID does not exist" containerID="e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.899701 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313"} err="failed to get container status \"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": rpc error: code = NotFound desc = could not find container 
\"e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313\": container with ID starting with e05d94eb53528f1e24ee807e15d8e3e220dd6d472a73fd6e5406333edf173313 not found: ID does not exist" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.899794 4994 scope.go:117] "RemoveContainer" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.901815 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": container with ID starting with 3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f not found: ID does not exist" containerID="3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.901924 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f"} err="failed to get container status \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": rpc error: code = NotFound desc = could not find container \"3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f\": container with ID starting with 3f71db636bdf5b15dc2a4d88a1924368bf500283730d683c5d6ecf26d403142f not found: ID does not exist" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.901992 4994 scope.go:117] "RemoveContainer" containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: E0310 00:25:28.905130 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": container with ID starting with 2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259 not found: ID does not exist" 
containerID="2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259" Mar 10 00:25:28 crc kubenswrapper[4994]: I0310 00:25:28.905232 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259"} err="failed to get container status \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": rpc error: code = NotFound desc = could not find container \"2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259\": container with ID starting with 2ffbe200dae8c537977e396e10397409f7e1d6544955c4b5927be8af1e713259 not found: ID does not exist" Mar 10 00:25:29 crc kubenswrapper[4994]: I0310 00:25:29.828665 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerStarted","Data":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} Mar 10 00:25:29 crc kubenswrapper[4994]: I0310 00:25:29.849975 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wb9zw" podStartSLOduration=2.349913155 podStartE2EDuration="4.849954514s" podCreationTimestamp="2026-03-10 00:25:25 +0000 UTC" firstStartedPulling="2026-03-10 00:25:26.806323274 +0000 UTC m=+1140.980030013" lastFinishedPulling="2026-03-10 00:25:29.306364623 +0000 UTC m=+1143.480071372" observedRunningTime="2026-03-10 00:25:29.848383631 +0000 UTC m=+1144.022090440" watchObservedRunningTime="2026-03-10 00:25:29.849954514 +0000 UTC m=+1144.023661273" Mar 10 00:25:30 crc kubenswrapper[4994]: I0310 00:25:30.560591 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" path="/var/lib/kubelet/pods/6cedd32c-f8fb-4e43-b82e-57c0ae8d384a/volumes" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.550772 4994 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.551254 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.625055 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:35 crc kubenswrapper[4994]: I0310 00:25:35.924855 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:36 crc kubenswrapper[4994]: I0310 00:25:36.310031 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:37 crc kubenswrapper[4994]: I0310 00:25:37.891618 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wb9zw" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" containerID="cri-o://5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" gracePeriod=2 Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.342610 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.440671 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.440905 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.441028 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") pod \"a3025eba-bc72-4841-8d11-e07912a08204\" (UID: \"a3025eba-bc72-4841-8d11-e07912a08204\") " Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.442280 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities" (OuterVolumeSpecName: "utilities") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.452197 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc" (OuterVolumeSpecName: "kube-api-access-xg5hc") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "kube-api-access-xg5hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.543384 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg5hc\" (UniqueName: \"kubernetes.io/projected/a3025eba-bc72-4841-8d11-e07912a08204-kube-api-access-xg5hc\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.543430 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.924963 4994 generic.go:334] "Generic (PLEG): container finished" podID="a3025eba-bc72-4841-8d11-e07912a08204" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" exitCode=0 Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925070 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925161 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb9zw" event={"ID":"a3025eba-bc72-4841-8d11-e07912a08204","Type":"ContainerDied","Data":"a6eab4ce936f87d49ced6dfd2bb1e60e9d0032bca1642da7e1ee18e34523c380"} Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925153 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb9zw" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.925198 4994 scope.go:117] "RemoveContainer" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.947824 4994 scope.go:117] "RemoveContainer" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:38 crc kubenswrapper[4994]: I0310 00:25:38.973803 4994 scope.go:117] "RemoveContainer" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.022838 4994 scope.go:117] "RemoveContainer" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.023896 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": container with ID starting with 5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994 not found: ID does not exist" containerID="5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.023967 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994"} err="failed to get container status \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": rpc error: code = NotFound desc = could not find container \"5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994\": container with ID starting with 5a9a07337dbcc927c36608f64f1e7b1e4780ebcbd08de41e8e7a13502c359994 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.024015 4994 scope.go:117] "RemoveContainer" 
containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.025338 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": container with ID starting with 07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8 not found: ID does not exist" containerID="07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025374 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8"} err="failed to get container status \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": rpc error: code = NotFound desc = could not find container \"07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8\": container with ID starting with 07778fd8836d7962c568af73031a8abfc14928df68fdb2acb6e31d5db3e67ec8 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025399 4994 scope.go:117] "RemoveContainer" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: E0310 00:25:39.025843 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": container with ID starting with 3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4 not found: ID does not exist" containerID="3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.025863 4994 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4"} err="failed to get container status \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": rpc error: code = NotFound desc = could not find container \"3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4\": container with ID starting with 3917e3791f6079fb882641caf5ac3e973f3541b87b097510ebfa185209a21de4 not found: ID does not exist" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.272330 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3025eba-bc72-4841-8d11-e07912a08204" (UID: "a3025eba-bc72-4841-8d11-e07912a08204"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.355366 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3025eba-bc72-4841-8d11-e07912a08204-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.582946 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:39 crc kubenswrapper[4994]: I0310 00:25:39.597258 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wb9zw"] Mar 10 00:25:40 crc kubenswrapper[4994]: I0310 00:25:40.561896 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3025eba-bc72-4841-8d11-e07912a08204" path="/var/lib/kubelet/pods/a3025eba-bc72-4841-8d11-e07912a08204/volumes" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130044 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 
00:26:00.130649 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130660 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130673 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130679 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130690 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130697 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-utilities" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130705 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130711 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 00:26:00.130724 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130729 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: E0310 
00:26:00.130739 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130745 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="extract-content" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130845 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cedd32c-f8fb-4e43-b82e-57c0ae8d384a" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.130858 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3025eba-bc72-4841-8d11-e07912a08204" containerName="registry-server" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.131243 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.134395 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.135092 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.141591 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.144646 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.201353 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: 
\"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.302544 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.325756 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"auto-csr-approver-29551706-trj4h\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.449694 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:00 crc kubenswrapper[4994]: I0310 00:26:00.877170 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:26:01 crc kubenswrapper[4994]: I0310 00:26:01.106861 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerStarted","Data":"74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6"} Mar 10 00:26:02 crc kubenswrapper[4994]: I0310 00:26:02.114379 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerStarted","Data":"4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18"} Mar 10 00:26:02 crc kubenswrapper[4994]: I0310 00:26:02.128342 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551706-trj4h" podStartSLOduration=1.242171514 podStartE2EDuration="2.128320753s" podCreationTimestamp="2026-03-10 00:26:00 +0000 UTC" firstStartedPulling="2026-03-10 00:26:00.888638372 +0000 UTC m=+1175.062345131" lastFinishedPulling="2026-03-10 00:26:01.774787621 +0000 UTC m=+1175.948494370" observedRunningTime="2026-03-10 00:26:02.127000808 +0000 UTC m=+1176.300707557" watchObservedRunningTime="2026-03-10 00:26:02.128320753 +0000 UTC m=+1176.302027512" Mar 10 00:26:03 crc kubenswrapper[4994]: I0310 00:26:03.123087 4994 generic.go:334] "Generic (PLEG): container finished" podID="c045a416-3fda-4dc3-b95a-15be10565d84" containerID="4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18" exitCode=0 Mar 10 00:26:03 crc kubenswrapper[4994]: I0310 00:26:03.123142 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" 
event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerDied","Data":"4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18"} Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.437492 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.555448 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") pod \"c045a416-3fda-4dc3-b95a-15be10565d84\" (UID: \"c045a416-3fda-4dc3-b95a-15be10565d84\") " Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.573055 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp" (OuterVolumeSpecName: "kube-api-access-p7thp") pod "c045a416-3fda-4dc3-b95a-15be10565d84" (UID: "c045a416-3fda-4dc3-b95a-15be10565d84"). InnerVolumeSpecName "kube-api-access-p7thp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:26:04 crc kubenswrapper[4994]: I0310 00:26:04.657001 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7thp\" (UniqueName: \"kubernetes.io/projected/c045a416-3fda-4dc3-b95a-15be10565d84-kube-api-access-p7thp\") on node \"crc\" DevicePath \"\"" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139059 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-trj4h" event={"ID":"c045a416-3fda-4dc3-b95a-15be10565d84","Type":"ContainerDied","Data":"74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6"} Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139103 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74f169ce0d92034fbe5865a3ef8ad30cc13e7968079cd4710eae07c024c594a6" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.139162 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-trj4h" Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.208310 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:26:05 crc kubenswrapper[4994]: I0310 00:26:05.217434 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-9pnx5"] Mar 10 00:26:06 crc kubenswrapper[4994]: I0310 00:26:06.562536 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07bbe9c-f27a-4256-82ba-3adc771e2ebd" path="/var/lib/kubelet/pods/e07bbe9c-f27a-4256-82ba-3adc771e2ebd/volumes" Mar 10 00:26:21 crc kubenswrapper[4994]: I0310 00:26:21.575184 4994 scope.go:117] "RemoveContainer" containerID="7f305cfa821f31b484905a2d361cdfc46f777a5744001baccc6a559f30eb2409" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.572272 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:23 crc kubenswrapper[4994]: E0310 00:27:23.573227 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.573248 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.573437 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" containerName="oc" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.574745 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.595649 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.698442 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.698548 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.699236 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.800913 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801084 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801177 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801681 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.801765 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.820218 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"certified-operators-dlw24\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:23 crc kubenswrapper[4994]: I0310 00:27:23.897001 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.414087 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.744223 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} Mar 10 00:27:24 crc kubenswrapper[4994]: I0310 00:27:24.744465 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"a6ebce8a9cc9b04bd5530ce4f4161d48c32326fbdbfa53b02c09cc0f94ae4154"} Mar 10 00:27:25 crc kubenswrapper[4994]: I0310 00:27:25.755806 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" exitCode=0 Mar 10 00:27:25 crc kubenswrapper[4994]: I0310 00:27:25.755865 4994 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} Mar 10 00:27:27 crc kubenswrapper[4994]: I0310 00:27:27.772140 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" exitCode=0 Mar 10 00:27:27 crc kubenswrapper[4994]: I0310 00:27:27.772296 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff"} Mar 10 00:27:30 crc kubenswrapper[4994]: I0310 00:27:30.796345 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerStarted","Data":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} Mar 10 00:27:30 crc kubenswrapper[4994]: I0310 00:27:30.820691 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlw24" podStartSLOduration=3.983302392 podStartE2EDuration="7.82065899s" podCreationTimestamp="2026-03-10 00:27:23 +0000 UTC" firstStartedPulling="2026-03-10 00:27:25.75875283 +0000 UTC m=+1259.932459619" lastFinishedPulling="2026-03-10 00:27:29.596109458 +0000 UTC m=+1263.769816217" observedRunningTime="2026-03-10 00:27:30.818288762 +0000 UTC m=+1264.991995531" watchObservedRunningTime="2026-03-10 00:27:30.82065899 +0000 UTC m=+1264.994365779" Mar 10 00:27:33 crc kubenswrapper[4994]: I0310 00:27:33.898605 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:33 crc kubenswrapper[4994]: I0310 
00:27:33.898912 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:34 crc kubenswrapper[4994]: I0310 00:27:34.941647 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dlw24" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" probeResult="failure" output=< Mar 10 00:27:34 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:27:34 crc kubenswrapper[4994]: > Mar 10 00:27:43 crc kubenswrapper[4994]: I0310 00:27:43.975987 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:44 crc kubenswrapper[4994]: I0310 00:27:44.031162 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:44 crc kubenswrapper[4994]: I0310 00:27:44.219435 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:45 crc kubenswrapper[4994]: I0310 00:27:45.906991 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlw24" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" containerID="cri-o://b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" gracePeriod=2 Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.336319 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432322 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432434 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.432480 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") pod \"10b2465a-61c4-4f13-9649-04138927dd46\" (UID: \"10b2465a-61c4-4f13-9649-04138927dd46\") " Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.433606 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities" (OuterVolumeSpecName: "utilities") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.437130 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl" (OuterVolumeSpecName: "kube-api-access-cmvcl") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "kube-api-access-cmvcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.483491 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10b2465a-61c4-4f13-9649-04138927dd46" (UID: "10b2465a-61c4-4f13-9649-04138927dd46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534545 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534586 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmvcl\" (UniqueName: \"kubernetes.io/projected/10b2465a-61c4-4f13-9649-04138927dd46-kube-api-access-cmvcl\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.534598 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b2465a-61c4-4f13-9649-04138927dd46-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.914977 4994 generic.go:334] "Generic (PLEG): container finished" podID="10b2465a-61c4-4f13-9649-04138927dd46" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" exitCode=0 Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915065 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlw24" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915344 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlw24" event={"ID":"10b2465a-61c4-4f13-9649-04138927dd46","Type":"ContainerDied","Data":"a6ebce8a9cc9b04bd5530ce4f4161d48c32326fbdbfa53b02c09cc0f94ae4154"} Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.915363 4994 scope.go:117] "RemoveContainer" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.932417 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.934912 4994 scope.go:117] "RemoveContainer" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.938070 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlw24"] Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.952789 4994 scope.go:117] "RemoveContainer" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977400 4994 scope.go:117] "RemoveContainer" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.977839 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": container with ID starting with b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141 not found: ID does not exist" containerID="b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977893 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141"} err="failed to get container status \"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": rpc error: code = NotFound desc = could not find container \"b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141\": container with ID starting with b5eae0fc639ec3f6f6983107829045ca2794ac47ff26d5b3fee8db36f9170141 not found: ID does not exist" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.977920 4994 scope.go:117] "RemoveContainer" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.978341 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": container with ID starting with c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff not found: ID does not exist" containerID="c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978369 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff"} err="failed to get container status \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": rpc error: code = NotFound desc = could not find container \"c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff\": container with ID 
starting with c83b51f79a7d25c750a3465e3b0efa9b4e7fc856c61e123048fe56e0eb4b45ff not found: ID does not exist" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978394 4994 scope.go:117] "RemoveContainer" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: E0310 00:27:46.978791 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": container with ID starting with ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4 not found: ID does not exist" containerID="ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4" Mar 10 00:27:46 crc kubenswrapper[4994]: I0310 00:27:46.978815 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4"} err="failed to get container status \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": rpc error: code = NotFound desc = could not find container \"ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4\": container with ID starting with ddb507c952463e052bfb6fb64471335c41041d0018e399ffdd4816c7a57ec8d4 not found: ID does not exist" Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 00:27:48.565338 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b2465a-61c4-4f13-9649-04138927dd46" path="/var/lib/kubelet/pods/10b2465a-61c4-4f13-9649-04138927dd46/volumes" Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 00:27:48.892667 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:27:48 crc kubenswrapper[4994]: I0310 
00:27:48.892758 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:27:57 crc kubenswrapper[4994]: I0310 00:27:57.008590 4994 generic.go:334] "Generic (PLEG): container finished" podID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerID="6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0" exitCode=0 Mar 10 00:27:57 crc kubenswrapper[4994]: I0310 00:27:57.008718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"6966590713cea1dd54724874068e8b954718fd9eea114677b3e7a90c5394c8c0"} Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.294831 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493200 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493245 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493308 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493365 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493381 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493434 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493474 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493503 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.493521 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494626 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494699 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494767 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494822 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") pod \"45ce1682-4d9a-4e51-ae93-a1832751d811\" (UID: \"45ce1682-4d9a-4e51-ae93-a1832751d811\") " Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.494239 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.495666 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.496533 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.501457 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf" (OuterVolumeSpecName: "kube-api-access-7mgnf") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "kube-api-access-7mgnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.502095 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.503900 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506276 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506313 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506328 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506349 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.506362 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45ce1682-4d9a-4e51-ae93-a1832751d811-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.514784 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.520412 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607154 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607191 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45ce1682-4d9a-4e51-ae93-a1832751d811-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607203 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 00:27:58.607216 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45ce1682-4d9a-4e51-ae93-a1832751d811-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:58 crc kubenswrapper[4994]: I0310 
00:27:58.607228 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mgnf\" (UniqueName: \"kubernetes.io/projected/45ce1682-4d9a-4e51-ae93-a1832751d811-kube-api-access-7mgnf\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029509 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"45ce1682-4d9a-4e51-ae93-a1832751d811","Type":"ContainerDied","Data":"4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe"} Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029588 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd2b54ef27177ce7bd06c1a0c62cb99ddedbaaaa4b7415ca746e6d6a53589fe" Mar 10 00:27:59 crc kubenswrapper[4994]: I0310 00:27:59.029657 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.163464 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164227 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164249 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164266 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-content" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164277 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-content" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164291 4994 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="git-clone" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164301 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="git-clone" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164319 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="manage-dockerfile" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164331 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="manage-dockerfile" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164346 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164356 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: E0310 00:28:00.164376 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-utilities" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164386 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="extract-utilities" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164568 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b2465a-61c4-4f13-9649-04138927dd46" containerName="registry-server" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.164594 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ce1682-4d9a-4e51-ae93-a1832751d811" containerName="docker-build" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.165263 4994 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168175 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168345 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.168690 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.173851 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.247325 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.257812 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.258015 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.359518 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.382822 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"auto-csr-approver-29551708-mcgcl\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.494173 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:00 crc kubenswrapper[4994]: I0310 00:28:00.773863 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:28:00 crc kubenswrapper[4994]: W0310 00:28:00.777428 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b6ae72_9c1a_4191_84af_d06b0155e244.slice/crio-77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a WatchSource:0}: Error finding container 77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a: Status 404 returned error can't find the container with id 77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.045275 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerStarted","Data":"77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a"} Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.883150 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "45ce1682-4d9a-4e51-ae93-a1832751d811" (UID: "45ce1682-4d9a-4e51-ae93-a1832751d811"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:01 crc kubenswrapper[4994]: I0310 00:28:01.896968 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45ce1682-4d9a-4e51-ae93-a1832751d811-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.529214 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.531955 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.542137 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.543371 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.545834 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.559124 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.577331 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604561 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 
00:28:02.604614 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604638 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604679 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604801 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc 
kubenswrapper[4994]: I0310 00:28:02.604944 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.604987 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605027 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605078 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" 
Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.605138 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.706970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707026 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707057 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707078 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc 
kubenswrapper[4994]: I0310 00:28:02.707132 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707180 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707200 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707220 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707250 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707289 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707340 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707637 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.707902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708215 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708434 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708713 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.708769 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709283 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: 
\"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.709932 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.713631 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.713664 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.738436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"sg-bridge-1-build\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:02 crc kubenswrapper[4994]: I0310 00:28:02.855731 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.072793 4994 generic.go:334] "Generic (PLEG): container finished" podID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerID="76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9" exitCode=0 Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.072903 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerDied","Data":"76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9"} Mar 10 00:28:03 crc kubenswrapper[4994]: I0310 00:28:03.144191 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:03 crc kubenswrapper[4994]: W0310 00:28:03.152047 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebc2c12_11b4_423b_98f7_043a38b945e3.slice/crio-bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9 WatchSource:0}: Error finding container bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9: Status 404 returned error can't find the container with id bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9 Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086411 4994 generic.go:334] "Generic (PLEG): container finished" podID="febc2c12-11b4-423b-98f7-043a38b945e3" containerID="a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391" exitCode=0 Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086523 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391"} Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.086596 4994 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerStarted","Data":"bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9"} Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.437223 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.535635 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") pod \"79b6ae72-9c1a-4191-84af-d06b0155e244\" (UID: \"79b6ae72-9c1a-4191-84af-d06b0155e244\") " Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.542310 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf" (OuterVolumeSpecName: "kube-api-access-n85sf") pod "79b6ae72-9c1a-4191-84af-d06b0155e244" (UID: "79b6ae72-9c1a-4191-84af-d06b0155e244"). InnerVolumeSpecName "kube-api-access-n85sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:04 crc kubenswrapper[4994]: I0310 00:28:04.637124 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n85sf\" (UniqueName: \"kubernetes.io/projected/79b6ae72-9c1a-4191-84af-d06b0155e244-kube-api-access-n85sf\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.096309 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.096272 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-mcgcl" event={"ID":"79b6ae72-9c1a-4191-84af-d06b0155e244","Type":"ContainerDied","Data":"77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a"} Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.097793 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77987b2d3a6178f38d6d4b99656280ec7ef9e91c4c82ee1cea49141ef3c6115a" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.099156 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerStarted","Data":"925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f"} Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.136287 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.136265613 podStartE2EDuration="3.136265613s" podCreationTimestamp="2026-03-10 00:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:05.134232283 +0000 UTC m=+1299.307939052" watchObservedRunningTime="2026-03-10 00:28:05.136265613 +0000 UTC m=+1299.309972372" Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.510119 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:28:05 crc kubenswrapper[4994]: I0310 00:28:05.523319 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-qz8hb"] Mar 10 00:28:06 crc kubenswrapper[4994]: I0310 00:28:06.564614 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0dbd271f-5f29-4221-bfbe-2274ce440c29" path="/var/lib/kubelet/pods/0dbd271f-5f29-4221-bfbe-2274ce440c29/volumes" Mar 10 00:28:12 crc kubenswrapper[4994]: I0310 00:28:12.853510 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:12 crc kubenswrapper[4994]: I0310 00:28:12.854270 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" containerID="cri-o://925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" gracePeriod=30 Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.157542 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.158273 4994 generic.go:334] "Generic (PLEG): container finished" podID="febc2c12-11b4-423b-98f7-043a38b945e3" containerID="925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" exitCode=1 Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.158318 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f"} Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.352921 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.353513 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468782 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468900 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468954 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468970 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.468992 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469144 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469207 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469252 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469354 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469405 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469443 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469477 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.469527 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") pod \"febc2c12-11b4-423b-98f7-043a38b945e3\" (UID: \"febc2c12-11b4-423b-98f7-043a38b945e3\") " Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.470063 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.470103 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). 
InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471224 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471297 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471609 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.472584 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481079 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn" (OuterVolumeSpecName: "kube-api-access-j86pn") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "kube-api-access-j86pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481109 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.481170 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.471720 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571443 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571489 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571506 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571522 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571539 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/febc2c12-11b4-423b-98f7-043a38b945e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571555 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571570 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/febc2c12-11b4-423b-98f7-043a38b945e3-builder-dockercfg-r4pz7-pull\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571586 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j86pn\" (UniqueName: \"kubernetes.io/projected/febc2c12-11b4-423b-98f7-043a38b945e3-kube-api-access-j86pn\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.571603 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/febc2c12-11b4-423b-98f7-043a38b945e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.578825 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.673503 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.888925 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "febc2c12-11b4-423b-98f7-043a38b945e3" (UID: "febc2c12-11b4-423b-98f7-043a38b945e3"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:13 crc kubenswrapper[4994]: I0310 00:28:13.977303 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/febc2c12-11b4-423b-98f7-043a38b945e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.166982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_febc2c12-11b4-423b-98f7-043a38b945e3/docker-build/0.log" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167615 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"febc2c12-11b4-423b-98f7-043a38b945e3","Type":"ContainerDied","Data":"bf0cd1c02ee41e3997d72ac822c22536731285fae7a54bbc141b19e514666ea9"} Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167687 4994 scope.go:117] "RemoveContainer" containerID="925f54c347107dfef1d6a968040c02d90a520675d87da19db00e9ab9a077bd4f" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.167739 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.191906 4994 scope.go:117] "RemoveContainer" containerID="a27ea216bf9458a64549c80f1db46605b4b76efa82db45780a652582c7511391" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.215660 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.221435 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478553 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478762 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="manage-dockerfile" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478773 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="manage-dockerfile" Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478784 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478790 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: E0310 00:28:14.478798 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478804 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478911 4994 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" containerName="docker-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.478927 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" containerName="oc" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.479683 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.481480 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.481829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.482291 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484397 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484930 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.484965 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: 
I0310 00:28:14.485029 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485086 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485110 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485210 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485243 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 
00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485266 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485323 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485364 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485390 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.485433 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " 
pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.503713 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.562333 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febc2c12-11b4-423b-98f7-043a38b945e3" path="/var/lib/kubelet/pods/febc2c12-11b4-423b-98f7-043a38b945e3/volumes" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587006 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587043 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587059 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587087 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 
10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587131 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587155 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587182 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587222 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587247 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587266 4994 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587288 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587310 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587666 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587691 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587762 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.587965 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588000 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588382 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.588695 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.589007 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: 
\"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.590225 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.591598 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.598685 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.604486 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"sg-bridge-2-build\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:14 crc kubenswrapper[4994]: I0310 00:28:14.794397 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:15 crc kubenswrapper[4994]: I0310 00:28:15.066089 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:28:15 crc kubenswrapper[4994]: I0310 00:28:15.176292 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0"} Mar 10 00:28:16 crc kubenswrapper[4994]: I0310 00:28:16.190384 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb"} Mar 10 00:28:17 crc kubenswrapper[4994]: I0310 00:28:17.200074 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb" exitCode=0 Mar 10 00:28:17 crc kubenswrapper[4994]: I0310 00:28:17.200155 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"848724b6dc463e3e8e91c8b36236fbf49ccdd2d3dd3583be8782337ddb94d9eb"} Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.210566 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="257cd17c481a0d71313837e0519c20670a47ba4ec6bde6881a97a4383e647ea5" exitCode=0 Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.210645 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"257cd17c481a0d71313837e0519c20670a47ba4ec6bde6881a97a4383e647ea5"} Mar 10 00:28:18 
crc kubenswrapper[4994]: I0310 00:28:18.277847 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_45372411-b93c-4485-8fea-d6802d98592f/manage-dockerfile/0.log" Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.893198 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:18 crc kubenswrapper[4994]: I0310 00:28:18.893680 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:19 crc kubenswrapper[4994]: I0310 00:28:19.223108 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerStarted","Data":"e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69"} Mar 10 00:28:19 crc kubenswrapper[4994]: I0310 00:28:19.270715 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.270684798 podStartE2EDuration="5.270684798s" podCreationTimestamp="2026-03-10 00:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:19.266813363 +0000 UTC m=+1313.440520162" watchObservedRunningTime="2026-03-10 00:28:19.270684798 +0000 UTC m=+1313.444391577" Mar 10 00:28:21 crc kubenswrapper[4994]: I0310 00:28:21.662254 4994 scope.go:117] "RemoveContainer" 
containerID="579778a9ca15bde63cc28eb094f98e7025e92dc67fe3a2f215a9a028c2966910" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.892392 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.893128 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.893188 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.894141 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:28:48 crc kubenswrapper[4994]: I0310 00:28:48.894234 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" gracePeriod=600 Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.484526 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" exitCode=0 Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.484603 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491"} Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.485144 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} Mar 10 00:28:49 crc kubenswrapper[4994]: I0310 00:28:49.485188 4994 scope.go:117] "RemoveContainer" containerID="4df6fccd58598d7ef69c17858b0dc84c63fc5b8f887c3889c08c1cb8b68f120e" Mar 10 00:29:05 crc kubenswrapper[4994]: I0310 00:29:05.635627 4994 generic.go:334] "Generic (PLEG): container finished" podID="45372411-b93c-4485-8fea-d6802d98592f" containerID="e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69" exitCode=0 Mar 10 00:29:05 crc kubenswrapper[4994]: I0310 00:29:05.635702 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"e48b2d6a05a6181bd819f2462e0e5a97bbf8486f23ce25be1a9818712e37ac69"} Mar 10 00:29:06 crc kubenswrapper[4994]: I0310 00:29:06.957655 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127320 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127449 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127503 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127588 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127654 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127732 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127784 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.127854 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130140 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130207 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.128142 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod 
"45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130267 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130325 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") pod \"45372411-b93c-4485-8fea-d6802d98592f\" (UID: \"45372411-b93c-4485-8fea-d6802d98592f\") " Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.128189 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.129570 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.129967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130844 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130865 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/45372411-b93c-4485-8fea-d6802d98592f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130911 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.130932 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.131284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.131620 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136165 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw" (OuterVolumeSpecName: "kube-api-access-zlxdw") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "kube-api-access-zlxdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136369 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.136429 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.232986 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233033 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlxdw\" (UniqueName: \"kubernetes.io/projected/45372411-b93c-4485-8fea-d6802d98592f-kube-api-access-zlxdw\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233052 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233071 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45372411-b93c-4485-8fea-d6802d98592f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.233089 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/45372411-b93c-4485-8fea-d6802d98592f-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.431270 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.436411 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655252 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"45372411-b93c-4485-8fea-d6802d98592f","Type":"ContainerDied","Data":"ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0"} Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655328 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac5cbcff64507a3b220a97a744b7b6014bcc97611b25e2ef088c6ab089cef2c0" Mar 10 00:29:07 crc kubenswrapper[4994]: I0310 00:29:07.655368 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.799038 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.873799 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.952296 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "45372411-b93c-4485-8fea-d6802d98592f" (UID: "45372411-b93c-4485-8fea-d6802d98592f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:09 crc kubenswrapper[4994]: I0310 00:29:09.975234 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/45372411-b93c-4485-8fea-d6802d98592f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303098 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303674 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303689 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303705 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="manage-dockerfile" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303714 4994 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="manage-dockerfile" Mar 10 00:29:11 crc kubenswrapper[4994]: E0310 00:29:11.303726 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="git-clone" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303734 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="git-clone" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.303900 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="45372411-b93c-4485-8fea-d6802d98592f" containerName="docker-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.304600 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307339 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307469 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.307761 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.308945 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.331258 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398361 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398425 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398458 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398480 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398519 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc 
kubenswrapper[4994]: I0310 00:29:11.398546 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398571 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398646 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398695 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398729 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") 
pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398847 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.398983 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500508 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500580 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500638 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500664 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500692 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500716 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500800 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500861 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.500944 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501332 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501365 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501379 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 
00:29:11.501428 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501465 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501507 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501552 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501599 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.501837 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.502143 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.502773 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.509626 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.515028 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.522365 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:11 crc kubenswrapper[4994]: I0310 00:29:11.628260 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.120276 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695663 4994 generic.go:334] "Generic (PLEG): container finished" podID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" exitCode=0 Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695927 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362"} Mar 10 00:29:12 crc kubenswrapper[4994]: I0310 00:29:12.695958 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerStarted","Data":"b4cf3f288f48594f960e028ba9e5003a8d1167d0c12c4cd7e1d79c3242ec5dda"} Mar 10 00:29:13 crc kubenswrapper[4994]: I0310 00:29:13.708939 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerStarted","Data":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} Mar 10 00:29:13 crc kubenswrapper[4994]: I0310 00:29:13.744937 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=2.744905829 podStartE2EDuration="2.744905829s" podCreationTimestamp="2026-03-10 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:29:13.743462862 +0000 UTC m=+1367.917169681" 
watchObservedRunningTime="2026-03-10 00:29:13.744905829 +0000 UTC m=+1367.918612618" Mar 10 00:29:21 crc kubenswrapper[4994]: I0310 00:29:21.831417 4994 scope.go:117] "RemoveContainer" containerID="eb388fa374116b603771455669ec292f8665b546692a48ed5cd70354b73fe681" Mar 10 00:29:21 crc kubenswrapper[4994]: I0310 00:29:21.879067 4994 scope.go:117] "RemoveContainer" containerID="5b7a38adc0c6a78554c415c32e9eb6f227d97fceb20d2f07317cb0b1675be632" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.036252 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.036734 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" containerID="cri-o://0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" gracePeriod=30 Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.445379 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_5673610d-32a0-4fe8-974d-e919df0dc6aa/docker-build/0.log" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.446172 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584395 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584470 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584541 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584542 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584579 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584641 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584691 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584753 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584797 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584863 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584942 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.584987 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585024 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") pod \"5673610d-32a0-4fe8-974d-e919df0dc6aa\" (UID: \"5673610d-32a0-4fe8-974d-e919df0dc6aa\") " Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585392 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585776 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.585792 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586007 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586460 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.586640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.588168 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594057 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594276 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.594591 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb" (OuterVolumeSpecName: "kube-api-access-mn7kb") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "kube-api-access-mn7kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.662242 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687540 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687586 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5673610d-32a0-4fe8-974d-e919df0dc6aa-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687606 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687623 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687638 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687653 4994 reconciler_common.go:293] 
"Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687669 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687684 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn7kb\" (UniqueName: \"kubernetes.io/projected/5673610d-32a0-4fe8-974d-e919df0dc6aa-kube-api-access-mn7kb\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687698 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/5673610d-32a0-4fe8-974d-e919df0dc6aa-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.687712 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784093 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_5673610d-32a0-4fe8-974d-e919df0dc6aa/docker-build/0.log" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784530 4994 generic.go:334] "Generic (PLEG): container finished" podID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" exitCode=1 Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784587 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784614 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"5673610d-32a0-4fe8-974d-e919df0dc6aa","Type":"ContainerDied","Data":"b4cf3f288f48594f960e028ba9e5003a8d1167d0c12c4cd7e1d79c3242ec5dda"} Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784635 4994 scope.go:117] "RemoveContainer" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.784796 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.807314 4994 scope.go:117] "RemoveContainer" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.835969 4994 scope.go:117] "RemoveContainer" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: E0310 00:29:22.837004 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": container with ID starting with 0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be not found: ID does not exist" containerID="0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837051 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be"} err="failed to get container status 
\"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": rpc error: code = NotFound desc = could not find container \"0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be\": container with ID starting with 0101afdca9a2da33d4934c97c511ef1661b88df35cca031dee80da26ee9bb3be not found: ID does not exist" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837088 4994 scope.go:117] "RemoveContainer" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: E0310 00:29:22.837510 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": container with ID starting with 8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362 not found: ID does not exist" containerID="8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362" Mar 10 00:29:22 crc kubenswrapper[4994]: I0310 00:29:22.837709 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362"} err="failed to get container status \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": rpc error: code = NotFound desc = could not find container \"8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362\": container with ID starting with 8a24cf2a5506eac936363ce5898f478f5750a25f077be288c38211ab1350f362 not found: ID does not exist" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.012535 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5673610d-32a0-4fe8-974d-e919df0dc6aa" (UID: "5673610d-32a0-4fe8-974d-e919df0dc6aa"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.095030 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5673610d-32a0-4fe8-974d-e919df0dc6aa-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.126381 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.134788 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.771618 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: E0310 00:29:23.772436 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.772620 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: E0310 00:29:23.772805 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="manage-dockerfile" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.773025 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="manage-dockerfile" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.773407 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" containerName="docker-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.775249 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779516 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779787 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.779862 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.780497 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.810051 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907335 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907399 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907426 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907645 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907752 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907781 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907826 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.907901 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908020 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908071 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:23 crc kubenswrapper[4994]: I0310 00:29:23.908158 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009215 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009260 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009287 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009304 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 
00:29:24.009336 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009366 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009390 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009415 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009432 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 
10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009461 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009504 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009537 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.009775 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010064 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010106 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010417 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010576 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010679 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.010686 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.011033 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.011218 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.014307 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.015072 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.028640 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.100449 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.337751 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.564635 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5673610d-32a0-4fe8-974d-e919df0dc6aa" path="/var/lib/kubelet/pods/5673610d-32a0-4fe8-974d-e919df0dc6aa/volumes" Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.804676 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c"} Mar 10 00:29:24 crc kubenswrapper[4994]: I0310 00:29:24.804721 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3"} Mar 10 00:29:25 crc kubenswrapper[4994]: I0310 00:29:25.813293 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c" exitCode=0 Mar 10 00:29:25 crc kubenswrapper[4994]: I0310 00:29:25.813353 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"fbe6d9cc5efceee105fad5a2e933a33bc5f5468a2f7d89006b830968fada061c"} Mar 10 
00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.823529 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="8fb9b76320be4c5299fb59a812c11fb1baa5de358024e62cc8edfe3261bbeaae" exitCode=0 Mar 10 00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.823622 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"8fb9b76320be4c5299fb59a812c11fb1baa5de358024e62cc8edfe3261bbeaae"} Mar 10 00:29:26 crc kubenswrapper[4994]: I0310 00:29:26.881939 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_74b83c90-7b71-4ae1-99a1-9f4b3e559ee5/manage-dockerfile/0.log" Mar 10 00:29:27 crc kubenswrapper[4994]: I0310 00:29:27.835181 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerStarted","Data":"1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45"} Mar 10 00:29:27 crc kubenswrapper[4994]: I0310 00:29:27.877336 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.877306541 podStartE2EDuration="4.877306541s" podCreationTimestamp="2026-03-10 00:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:29:27.871293807 +0000 UTC m=+1382.045000586" watchObservedRunningTime="2026-03-10 00:29:27.877306541 +0000 UTC m=+1382.051013330" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.158847 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"] Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.160817 4994 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164370 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164546 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.164746 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.171638 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"] Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.172677 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.177413 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.177695 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.215819 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"] Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.226646 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"] Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.259779 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzct8\" (UniqueName: 
\"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.259849 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.260021 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.260070 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361074 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" 
Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361144 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361216 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.361243 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.362703 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.380986 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.383924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"auto-csr-approver-29551710-vrwfw\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.389061 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"collect-profiles-29551710-xlhkh\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.500516 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.508954 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.795851 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"] Mar 10 00:30:00 crc kubenswrapper[4994]: I0310 00:30:00.869849 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh"] Mar 10 00:30:00 crc kubenswrapper[4994]: W0310 00:30:00.874544 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938a13c5_9bbf_4720_95ae_9e56e9d1f085.slice/crio-b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe WatchSource:0}: Error finding container b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe: Status 404 returned error can't find the container with id b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.127177 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerStarted","Data":"4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb"} Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.129244 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerStarted","Data":"40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507"} Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.129290 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" 
event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerStarted","Data":"b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe"} Mar 10 00:30:01 crc kubenswrapper[4994]: I0310 00:30:01.148836 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" podStartSLOduration=1.148807753 podStartE2EDuration="1.148807753s" podCreationTimestamp="2026-03-10 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:01.145166309 +0000 UTC m=+1415.318873118" watchObservedRunningTime="2026-03-10 00:30:01.148807753 +0000 UTC m=+1415.322514542" Mar 10 00:30:02 crc kubenswrapper[4994]: I0310 00:30:02.138623 4994 generic.go:334] "Generic (PLEG): container finished" podID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerID="40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507" exitCode=0 Mar 10 00:30:02 crc kubenswrapper[4994]: I0310 00:30:02.138705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerDied","Data":"40ada2de530609925c6412ebabe6fcca0a00c2d47cc81793d57dac2812d96507"} Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.149617 4994 generic.go:334] "Generic (PLEG): container finished" podID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerID="dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7" exitCode=0 Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.149726 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerDied","Data":"dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7"} Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.504571 4994 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615108 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615545 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.615652 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") pod \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\" (UID: \"938a13c5-9bbf-4720-95ae-9e56e9d1f085\") " Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.616928 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume" (OuterVolumeSpecName: "config-volume") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.636324 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj" (OuterVolumeSpecName: "kube-api-access-tzpmj") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "kube-api-access-tzpmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.636386 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "938a13c5-9bbf-4720-95ae-9e56e9d1f085" (UID: "938a13c5-9bbf-4720-95ae-9e56e9d1f085"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717330 4994 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/938a13c5-9bbf-4720-95ae-9e56e9d1f085-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717386 4994 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/938a13c5-9bbf-4720-95ae-9e56e9d1f085-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:03 crc kubenswrapper[4994]: I0310 00:30:03.717405 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzpmj\" (UniqueName: \"kubernetes.io/projected/938a13c5-9bbf-4720-95ae-9e56e9d1f085-kube-api-access-tzpmj\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165816 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165854 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-xlhkh" event={"ID":"938a13c5-9bbf-4720-95ae-9e56e9d1f085","Type":"ContainerDied","Data":"b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe"} Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.165956 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53385df10843aab2ad8ee46166e5f7b08cd9244ce51883694fff2c4568ceffe" Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.460448 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.528253 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") pod \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\" (UID: \"907fae93-d4a7-46e8-9fab-3c964fcb52ab\") " Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.532843 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8" (OuterVolumeSpecName: "kube-api-access-wzct8") pod "907fae93-d4a7-46e8-9fab-3c964fcb52ab" (UID: "907fae93-d4a7-46e8-9fab-3c964fcb52ab"). InnerVolumeSpecName "kube-api-access-wzct8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:04 crc kubenswrapper[4994]: I0310 00:30:04.629718 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzct8\" (UniqueName: \"kubernetes.io/projected/907fae93-d4a7-46e8-9fab-3c964fcb52ab-kube-api-access-wzct8\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.178047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" event={"ID":"907fae93-d4a7-46e8-9fab-3c964fcb52ab","Type":"ContainerDied","Data":"4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb"} Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.179202 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4377cc881d06904302f5a9c8517a3998ed1077d10d3b6579c8372dc366e78cfb" Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.178125 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-vrwfw" Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.526366 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:30:05 crc kubenswrapper[4994]: I0310 00:30:05.531915 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-8n96k"] Mar 10 00:30:06 crc kubenswrapper[4994]: I0310 00:30:06.562749 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1779b4-8945-4667-b086-b7481edf1099" path="/var/lib/kubelet/pods/ca1779b4-8945-4667-b086-b7481edf1099/volumes" Mar 10 00:30:21 crc kubenswrapper[4994]: I0310 00:30:21.305392 4994 generic.go:334] "Generic (PLEG): container finished" podID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerID="1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45" exitCode=0 Mar 10 00:30:21 crc kubenswrapper[4994]: I0310 00:30:21.305505 4994 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"1cec55edb935643ef06816de90b5acce8ec6dca68cf8395b56b9eae6f63e0b45"} Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.088218 4994 scope.go:117] "RemoveContainer" containerID="05dbb86f2a8b07de1b18fc5d17c0892e037b64a657609562e3eb766699973201" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.648732 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792156 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792260 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792291 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792328 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") pod 
\"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792362 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792392 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g8bl\" (UniqueName: \"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792431 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792460 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792443 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792494 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792527 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792571 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792651 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") pod \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\" (UID: \"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5\") " Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.792838 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.793864 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.793926 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.794375 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.794550 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.795121 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.795965 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.798345 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.800659 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl" (OuterVolumeSpecName: "kube-api-access-7g8bl") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "kube-api-access-7g8bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.800957 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.801122 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.886320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.894975 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895024 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895044 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895068 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g8bl\" (UniqueName: 
\"kubernetes.io/projected/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-kube-api-access-7g8bl\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895087 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895107 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895127 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895145 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:22 crc kubenswrapper[4994]: I0310 00:30:22.895163 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331359 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"74b83c90-7b71-4ae1-99a1-9f4b3e559ee5","Type":"ContainerDied","Data":"e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3"} Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331422 4994 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e73b4ce11c2c82beb195c98c6499aa1914f9f2f2e89ddf242434359b3b2d56d3" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.331423 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.759763 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" (UID: "74b83c90-7b71-4ae1-99a1-9f4b3e559ee5"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:23 crc kubenswrapper[4994]: I0310 00:30:23.808299 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/74b83c90-7b71-4ae1-99a1-9f4b3e559ee5-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.983779 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984588 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984604 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984628 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984636 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc 
kubenswrapper[4994]: E0310 00:30:31.984649 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984656 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984667 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="manage-dockerfile" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984673 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="manage-dockerfile" Mar 10 00:30:31 crc kubenswrapper[4994]: E0310 00:30:31.984689 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="git-clone" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984696 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="git-clone" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984831 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="938a13c5-9bbf-4720-95ae-9e56e9d1f085" containerName="collect-profiles" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984849 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" containerName="oc" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.984858 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b83c90-7b71-4ae1-99a1-9f4b3e559ee5" containerName="docker-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.985567 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.988952 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.989969 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.990231 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 10 00:30:31 crc kubenswrapper[4994]: I0310 00:30:31.990979 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.011522 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027236 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027312 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 
00:30:32.027354 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027388 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027446 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027506 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027685 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.027939 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028006 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028066 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.028111 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129575 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129655 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129702 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129745 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129789 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129842 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129850 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.129932 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc 
kubenswrapper[4994]: I0310 00:30:32.129986 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130016 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130047 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130080 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130164 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130279 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130372 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130690 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130733 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130696 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.130777 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.131257 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.131764 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.136975 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.139139 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:32 crc kubenswrapper[4994]: I0310 00:30:32.149805 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:33 crc kubenswrapper[4994]: I0310 00:30:33.314583 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:33 crc kubenswrapper[4994]: I0310 00:30:33.795518 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340145 4994 generic.go:334] "Generic (PLEG): container finished" podID="2f68d229-c995-41a3-b73b-171d31d81311" containerID="7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e" exitCode=0 Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340209 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e"} Mar 10 00:30:34 crc kubenswrapper[4994]: I0310 00:30:34.340499 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerStarted","Data":"b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e"} Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.351681 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.354511 4994 generic.go:334] "Generic (PLEG): container finished" podID="2f68d229-c995-41a3-b73b-171d31d81311" containerID="efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e" exitCode=1 Mar 10 00:30:35 crc kubenswrapper[4994]: I0310 00:30:35.354580 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e"} Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.702630 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.703290 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.796952 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797090 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797141 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797181 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") pod 
\"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797237 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797290 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797361 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.797933 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798096 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798332 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798432 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798468 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798564 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798596 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798692 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798720 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") pod \"2f68d229-c995-41a3-b73b-171d31d81311\" (UID: \"2f68d229-c995-41a3-b73b-171d31d81311\") " Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.798851 4994 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799085 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799277 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799343 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799365 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2f68d229-c995-41a3-b73b-171d31d81311-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799380 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799392 4994 reconciler_common.go:293] 
"Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799430 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.799755 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.801391 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.801670 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.802524 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.803592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.805023 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4" (OuterVolumeSpecName: "kube-api-access-wwcj4") pod "2f68d229-c995-41a3-b73b-171d31d81311" (UID: "2f68d229-c995-41a3-b73b-171d31d81311"). InnerVolumeSpecName "kube-api-access-wwcj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900641 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900674 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwcj4\" (UniqueName: \"kubernetes.io/projected/2f68d229-c995-41a3-b73b-171d31d81311-kube-api-access-wwcj4\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900682 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2f68d229-c995-41a3-b73b-171d31d81311-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900691 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900699 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f68d229-c995-41a3-b73b-171d31d81311-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:36 crc kubenswrapper[4994]: I0310 00:30:36.900707 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/2f68d229-c995-41a3-b73b-171d31d81311-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.376481 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_2f68d229-c995-41a3-b73b-171d31d81311/docker-build/0.log" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377157 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"2f68d229-c995-41a3-b73b-171d31d81311","Type":"ContainerDied","Data":"b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e"} Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377208 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c407df3480f0bdf28a09e039081c8f8ddbb1de11afdd6db04cccd8d962c75e" Mar 10 00:30:37 crc kubenswrapper[4994]: I0310 00:30:37.377315 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.483101 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.496346 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:42 crc kubenswrapper[4994]: I0310 00:30:42.566866 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f68d229-c995-41a3-b73b-171d31d81311" path="/var/lib/kubelet/pods/2f68d229-c995-41a3-b73b-171d31d81311/volumes" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100469 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:44 crc kubenswrapper[4994]: E0310 00:30:44.100719 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100731 4994 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: E0310 00:30:44.100747 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="manage-dockerfile" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100755 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="manage-dockerfile" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.100862 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f68d229-c995-41a3-b73b-171d31d81311" containerName="docker-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.104457 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.106769 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.106829 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.110140 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.110152 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.127146 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206530 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206650 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206803 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206954 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.206988 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207018 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207166 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207248 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207283 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207403 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.207495 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308617 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308693 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308770 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308801 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308830 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308854 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308892 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308915 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308959 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.308996 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" 
(UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.309036 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.309123 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310003 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc 
kubenswrapper[4994]: I0310 00:30:44.310175 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310190 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310613 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.310676 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.311330 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.311957 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.314634 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.316280 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.339573 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.424570 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:44 crc kubenswrapper[4994]: I0310 00:30:44.739854 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:45 crc kubenswrapper[4994]: I0310 00:30:45.467606 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343"} Mar 10 00:30:45 crc kubenswrapper[4994]: I0310 00:30:45.468067 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536"} Mar 10 00:30:45 crc kubenswrapper[4994]: E0310 00:30:45.617656 4994 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.164:53200->38.102.83.164:37473: read tcp 38.102.83.164:53200->38.102.83.164:37473: read: connection reset by peer Mar 10 00:30:46 crc kubenswrapper[4994]: I0310 00:30:46.489897 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" containerID="6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343" exitCode=0 Mar 10 00:30:46 crc kubenswrapper[4994]: I0310 00:30:46.489944 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"6bdd8c21673cfca7f7ecd67cec43deb1c83c74ac59ed88ea65a343de1e5ef343"} Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.501621 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" 
containerID="f315dad252472048091cc7248e3f2692d6c35b04b3a27299375edace2561d87c" exitCode=0 Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.501719 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"f315dad252472048091cc7248e3f2692d6c35b04b3a27299375edace2561d87c"} Mar 10 00:30:47 crc kubenswrapper[4994]: I0310 00:30:47.555754 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_7902abc7-2de3-4fde-ba8d-71694a115914/manage-dockerfile/0.log" Mar 10 00:30:48 crc kubenswrapper[4994]: I0310 00:30:48.515996 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerStarted","Data":"8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217"} Mar 10 00:30:48 crc kubenswrapper[4994]: I0310 00:30:48.556618 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.556585839 podStartE2EDuration="4.556585839s" podCreationTimestamp="2026-03-10 00:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:48.55429375 +0000 UTC m=+1462.728000499" watchObservedRunningTime="2026-03-10 00:30:48.556585839 +0000 UTC m=+1462.730292628" Mar 10 00:30:53 crc kubenswrapper[4994]: I0310 00:30:53.562487 4994 generic.go:334] "Generic (PLEG): container finished" podID="7902abc7-2de3-4fde-ba8d-71694a115914" containerID="8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217" exitCode=0 Mar 10 00:30:53 crc kubenswrapper[4994]: I0310 00:30:53.562544 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"8b19189a44163a87912f9861e6175641d65fd08147a2b1eecc6b0ca0ba642217"} Mar 10 00:30:54 crc kubenswrapper[4994]: I0310 00:30:54.935055 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077560 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077841 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.077981 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078073 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078184 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078230 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078280 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078327 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078376 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.078462 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079001 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079080 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079392 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.079585 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.081977 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082066 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") pod \"7902abc7-2de3-4fde-ba8d-71694a115914\" (UID: \"7902abc7-2de3-4fde-ba8d-71694a115914\") " Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082564 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-run\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082588 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082601 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082616 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082629 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7902abc7-2de3-4fde-ba8d-71694a115914-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.082641 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.083262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.083516 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.084795 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.084925 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8" (OuterVolumeSpecName: "kube-api-access-6mfl8") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "kube-api-access-6mfl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.085262 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.093597 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7902abc7-2de3-4fde-ba8d-71694a115914" (UID: "7902abc7-2de3-4fde-ba8d-71694a115914"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184275 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184315 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184332 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7902abc7-2de3-4fde-ba8d-71694a115914-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184344 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mfl8\" (UniqueName: \"kubernetes.io/projected/7902abc7-2de3-4fde-ba8d-71694a115914-kube-api-access-6mfl8\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.184357 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/7902abc7-2de3-4fde-ba8d-71694a115914-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: 
I0310 00:30:55.184371 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7902abc7-2de3-4fde-ba8d-71694a115914-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583328 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"7902abc7-2de3-4fde-ba8d-71694a115914","Type":"ContainerDied","Data":"742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536"} Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583390 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742b6efdf8a46db211dfd77bb0b22ee1b2782176fb1fd4661b2d9ecbddff8536" Mar 10 00:30:55 crc kubenswrapper[4994]: I0310 00:30:55.583391 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.645103 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 00:30:58.646037 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="git-clone" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646064 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="git-clone" Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 00:30:58.646104 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646115 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: E0310 
00:30:58.646133 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="manage-dockerfile" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646144 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="manage-dockerfile" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.646340 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="7902abc7-2de3-4fde-ba8d-71694a115914" containerName="docker-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.647627 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.650456 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651246 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651446 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.651585 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.671151 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841285 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841341 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841491 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841563 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841636 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc 
kubenswrapper[4994]: I0310 00:30:58.841686 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841734 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841768 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841834 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841918 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.841982 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.842017 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943393 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943448 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc 
kubenswrapper[4994]: I0310 00:30:58.943496 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943530 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943568 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943595 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943635 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943688 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943712 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943748 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943775 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.943983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944052 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944759 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.944860 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 
00:30:58.944991 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945091 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945353 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945361 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.945930 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.954924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.959285 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.974016 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:58 crc kubenswrapper[4994]: I0310 00:30:58.976917 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:59 crc kubenswrapper[4994]: I0310 00:30:59.491539 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:59 crc kubenswrapper[4994]: W0310 00:30:59.492525 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f543ac_d85b_4776_9cae_5475e4a43318.slice/crio-07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313 WatchSource:0}: Error finding container 07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313: Status 404 returned error can't find the container with id 07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313 Mar 10 00:30:59 crc kubenswrapper[4994]: I0310 00:30:59.618210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerStarted","Data":"07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313"} Mar 10 00:31:00 crc kubenswrapper[4994]: I0310 00:31:00.631063 4994 generic.go:334] "Generic (PLEG): container finished" podID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerID="74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2" exitCode=0 Mar 10 00:31:00 crc kubenswrapper[4994]: I0310 00:31:00.631134 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2"} Mar 10 00:31:01 crc kubenswrapper[4994]: I0310 00:31:01.642789 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:01 crc 
kubenswrapper[4994]: I0310 00:31:01.643924 4994 generic.go:334] "Generic (PLEG): container finished" podID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerID="d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1" exitCode=1 Mar 10 00:31:01 crc kubenswrapper[4994]: I0310 00:31:01.643987 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1"} Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.018846 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.020094 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110637 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110794 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110826 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir" (OuterVolumeSpecName: "buildcachedir") 
pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.110868 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111082 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111197 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111353 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111438 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: 
\"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111497 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111526 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111561 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111595 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111634 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") pod \"a0f543ac-d85b-4776-9cae-5475e4a43318\" (UID: \"a0f543ac-d85b-4776-9cae-5475e4a43318\") " Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111850 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111974 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.111967 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.112315 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.112831 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.113036 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.114327 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.115649 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.116347 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.118592 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.119229 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv" (OuterVolumeSpecName: "kube-api-access-sbxgv") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "kube-api-access-sbxgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.119927 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "a0f543ac-d85b-4776-9cae-5475e4a43318" (UID: "a0f543ac-d85b-4776-9cae-5475e4a43318"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214497 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214553 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0f543ac-d85b-4776-9cae-5475e4a43318-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214575 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214595 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214613 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214631 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214649 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-ca-bundles\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214665 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0f543ac-d85b-4776-9cae-5475e4a43318-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214681 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0f543ac-d85b-4776-9cae-5475e4a43318-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214698 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbxgv\" (UniqueName: \"kubernetes.io/projected/a0f543ac-d85b-4776-9cae-5475e4a43318-kube-api-access-sbxgv\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.214717 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/a0f543ac-d85b-4776-9cae-5475e4a43318-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.662928 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_a0f543ac-d85b-4776-9cae-5475e4a43318/docker-build/0.log" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663671 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"a0f543ac-d85b-4776-9cae-5475e4a43318","Type":"ContainerDied","Data":"07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313"} Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663725 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07bc0c4b967ac20d07fc8fce99808770b0f8bd4f892bee865400735fd44e1313" Mar 10 00:31:03 crc kubenswrapper[4994]: I0310 00:31:03.663744 4994 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:31:09 crc kubenswrapper[4994]: I0310 00:31:09.120591 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:31:09 crc kubenswrapper[4994]: I0310 00:31:09.132099 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.566780 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" path="/var/lib/kubelet/pods/a0f543ac-d85b-4776-9cae-5475e4a43318/volumes" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.769842 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: E0310 00:31:10.770331 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="manage-dockerfile" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770365 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="manage-dockerfile" Mar 10 00:31:10 crc kubenswrapper[4994]: E0310 00:31:10.770404 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770418 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.770655 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f543ac-d85b-4776-9cae-5475e4a43318" containerName="docker-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.772748 4994 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.776005 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.776785 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.777392 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.778844 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.779149 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948822 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.948971 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949082 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949139 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949274 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949355 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949428 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949524 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: I0310 00:31:10.949653 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:10 crc kubenswrapper[4994]: 
I0310 00:31:10.949707 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050580 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050670 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050724 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050865 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050929 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050976 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.050998 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051028 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051114 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051160 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051230 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051280 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc 
kubenswrapper[4994]: I0310 00:31:11.051348 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.051432 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.052513 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053073 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053279 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053800 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.053934 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.054281 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.061679 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.061711 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.085546 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.103162 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.433772 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:31:11 crc kubenswrapper[4994]: I0310 00:31:11.779358 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa"} Mar 10 00:31:12 crc kubenswrapper[4994]: I0310 00:31:12.792731 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495"} Mar 10 00:31:13 crc kubenswrapper[4994]: E0310 00:31:13.041642 4994 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.164:47462->38.102.83.164:37473: read tcp 38.102.83.164:47462->38.102.83.164:37473: read: connection reset by peer Mar 10 
00:31:13 crc kubenswrapper[4994]: I0310 00:31:13.807621 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495" exitCode=0 Mar 10 00:31:13 crc kubenswrapper[4994]: I0310 00:31:13.807695 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"0c01f1deef18e844bb567b3557ae2178106db4c54958eedf7439e442df977495"} Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.819075 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="3f34506bf5d0590830eeb3f08b1bed951f6f1560b97971ea45d24f66e9003fe6" exitCode=0 Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.819146 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"3f34506bf5d0590830eeb3f08b1bed951f6f1560b97971ea45d24f66e9003fe6"} Mar 10 00:31:14 crc kubenswrapper[4994]: I0310 00:31:14.878669 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_3dee5c53-e9b2-4246-8d9a-30fa345c1f0f/manage-dockerfile/0.log" Mar 10 00:31:15 crc kubenswrapper[4994]: I0310 00:31:15.833252 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerStarted","Data":"c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9"} Mar 10 00:31:15 crc kubenswrapper[4994]: I0310 00:31:15.887281 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.887228816 
podStartE2EDuration="5.887228816s" podCreationTimestamp="2026-03-10 00:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:31:15.876522571 +0000 UTC m=+1490.050229410" watchObservedRunningTime="2026-03-10 00:31:15.887228816 +0000 UTC m=+1490.060935605" Mar 10 00:31:18 crc kubenswrapper[4994]: I0310 00:31:18.892530 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:31:18 crc kubenswrapper[4994]: I0310 00:31:18.893346 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:31:19 crc kubenswrapper[4994]: I0310 00:31:19.883729 4994 generic.go:334] "Generic (PLEG): container finished" podID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerID="c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9" exitCode=0 Mar 10 00:31:19 crc kubenswrapper[4994]: I0310 00:31:19.883830 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"c8f445e7254180f6d08cfdcb52469d60bb6e8cb339809370ede142970d703bd9"} Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.246082 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368335 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368396 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368422 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368486 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368509 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368541 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368563 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368582 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368643 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368661 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368686 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: 
\"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.368718 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") pod \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\" (UID: \"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f\") " Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369431 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369471 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.369573 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.370696 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.372144 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.372277 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.373065 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.373808 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376048 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf" (OuterVolumeSpecName: "kube-api-access-957nf") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "kube-api-access-957nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376903 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.376864 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.384682 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" (UID: "3dee5c53-e9b2-4246-8d9a-30fa345c1f0f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469764 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469804 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-957nf\" (UniqueName: \"kubernetes.io/projected/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-kube-api-access-957nf\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469819 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469830 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469842 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469852 4994 
reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469906 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469920 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469930 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469941 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469953 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.469964 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/3dee5c53-e9b2-4246-8d9a-30fa345c1f0f-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905034 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"3dee5c53-e9b2-4246-8d9a-30fa345c1f0f","Type":"ContainerDied","Data":"072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa"} Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905113 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072eb920df43a19ae92b79127c1e5d8130581afedbc74bbcfd7fc4942ec7c1aa" Mar 10 00:31:21 crc kubenswrapper[4994]: I0310 00:31:21.905125 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.988787 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989808 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="manage-dockerfile" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989830 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="manage-dockerfile" Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989868 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="git-clone" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989909 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="git-clone" Mar 10 00:31:37 crc kubenswrapper[4994]: E0310 00:31:37.989941 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.989954 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 
10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.990157 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dee5c53-e9b2-4246-8d9a-30fa345c1f0f" containerName="docker-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.991617 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.994317 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.995729 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.998136 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.998418 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-r4pz7" Mar 10 00:31:37 crc kubenswrapper[4994]: I0310 00:31:37.999684 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.013192 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121613 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121688 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121729 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.121861 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122030 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122121 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122220 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122327 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122360 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122375 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122405 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122444 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.122461 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224472 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224557 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224596 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224636 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224682 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224713 4994 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224753 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224792 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224826 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224893 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224944 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.224997 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225054 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225364 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.225903 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226041 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226046 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226124 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.226188 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 
00:31:38.226713 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.227181 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.227182 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.235781 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.236179 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.241493 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.249391 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"service-telemetry-framework-index-1-build\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.313594 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4994]: I0310 00:31:38.631399 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 10 00:31:39 crc kubenswrapper[4994]: I0310 00:31:39.087969 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f"} Mar 10 00:31:39 crc kubenswrapper[4994]: I0310 00:31:39.088033 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3"} Mar 10 00:31:40 crc kubenswrapper[4994]: I0310 00:31:40.100036 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f" exitCode=0 Mar 10 00:31:40 crc kubenswrapper[4994]: I0310 00:31:40.100137 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"5982f0ac6e6cb6b4ab7545dbf91d9e7d9897958af8f6e1bcb23dd95fc5ffd82f"} Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.110270 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="6706d407faeb67d398bcf4d8c3facdfecbf69671980d8bbc6f719c8f6a184918" exitCode=0 Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.110322 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"6706d407faeb67d398bcf4d8c3facdfecbf69671980d8bbc6f719c8f6a184918"} Mar 10 00:31:41 crc kubenswrapper[4994]: I0310 00:31:41.161863 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_bc817ebb-7ab7-41d2-8961-92191d7749e9/manage-dockerfile/0.log" Mar 10 00:31:42 crc kubenswrapper[4994]: I0310 00:31:42.126496 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerStarted","Data":"3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14"} Mar 10 00:31:42 crc kubenswrapper[4994]: I0310 00:31:42.179360 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.179335128 podStartE2EDuration="5.179335128s" podCreationTimestamp="2026-03-10 00:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:31:42.172070453 +0000 UTC m=+1516.345777242" watchObservedRunningTime="2026-03-10 00:31:42.179335128 +0000 UTC m=+1516.353041917" Mar 10 00:31:48 crc kubenswrapper[4994]: I0310 00:31:48.892732 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:31:48 crc kubenswrapper[4994]: I0310 00:31:48.893420 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.148944 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.150351 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153201 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153345 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.153437 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.169544 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.230583 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.332954 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " 
pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.375608 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"auto-csr-approver-29551712-nx9pb\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.483956 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.731668 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:32:00 crc kubenswrapper[4994]: I0310 00:32:00.740979 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:32:01 crc kubenswrapper[4994]: I0310 00:32:01.269993 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerStarted","Data":"03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59"} Mar 10 00:32:03 crc kubenswrapper[4994]: I0310 00:32:03.289990 4994 generic.go:334] "Generic (PLEG): container finished" podID="615394b2-0705-4358-853e-8c52eb448519" containerID="ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b" exitCode=0 Mar 10 00:32:03 crc kubenswrapper[4994]: I0310 00:32:03.290088 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerDied","Data":"ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b"} Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.590139 4994 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.696457 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") pod \"615394b2-0705-4358-853e-8c52eb448519\" (UID: \"615394b2-0705-4358-853e-8c52eb448519\") " Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.703933 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8" (OuterVolumeSpecName: "kube-api-access-ggdf8") pod "615394b2-0705-4358-853e-8c52eb448519" (UID: "615394b2-0705-4358-853e-8c52eb448519"). InnerVolumeSpecName "kube-api-access-ggdf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4994]: I0310 00:32:04.798119 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggdf8\" (UniqueName: \"kubernetes.io/projected/615394b2-0705-4358-853e-8c52eb448519-kube-api-access-ggdf8\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312445 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" event={"ID":"615394b2-0705-4358-853e-8c52eb448519","Type":"ContainerDied","Data":"03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59"} Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312533 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e1ec631057a240591c7aee984d946412c033bd597ebc69609979acd12bee59" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.312671 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-nx9pb" Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.670359 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:32:05 crc kubenswrapper[4994]: I0310 00:32:05.679141 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-trj4h"] Mar 10 00:32:06 crc kubenswrapper[4994]: I0310 00:32:06.562924 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c045a416-3fda-4dc3-b95a-15be10565d84" path="/var/lib/kubelet/pods/c045a416-3fda-4dc3-b95a-15be10565d84/volumes" Mar 10 00:32:12 crc kubenswrapper[4994]: I0310 00:32:12.373561 4994 generic.go:334] "Generic (PLEG): container finished" podID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerID="3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14" exitCode=0 Mar 10 00:32:12 crc kubenswrapper[4994]: I0310 00:32:12.373782 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"3d75614a57566feda03e9701c8258587fc4a6f9dcc314b90f72c6fbb29937a14"} Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.686773 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830752 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830806 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830829 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830866 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830913 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830941 4994 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.830965 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831000 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831046 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831089 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831116 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44sn\" (UniqueName: 
\"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831164 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831209 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") pod \"bc817ebb-7ab7-41d2-8961-92191d7749e9\" (UID: \"bc817ebb-7ab7-41d2-8961-92191d7749e9\") " Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831463 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.831934 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832195 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832320 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.832734 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.833451 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.834127 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.840138 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-pull") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "builder-dockercfg-r4pz7-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.842142 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.842396 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push" (OuterVolumeSpecName: "builder-dockercfg-r4pz7-push") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "builder-dockercfg-r4pz7-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.843439 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn" (OuterVolumeSpecName: "kube-api-access-p44sn") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "kube-api-access-p44sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.932966 4994 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933005 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933019 4994 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933031 4994 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933044 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44sn\" (UniqueName: \"kubernetes.io/projected/bc817ebb-7ab7-41d2-8961-92191d7749e9-kube-api-access-p44sn\") on node \"crc\" DevicePath \"\"" Mar 10 
00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933055 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-pull\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933066 4994 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bc817ebb-7ab7-41d2-8961-92191d7749e9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933076 4994 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933090 4994 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933100 4994 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:13 crc kubenswrapper[4994]: I0310 00:32:13.933112 4994 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-r4pz7-push\" (UniqueName: \"kubernetes.io/secret/bc817ebb-7ab7-41d2-8961-92191d7749e9-builder-dockercfg-r4pz7-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.062426 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: 
"bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.136556 4994 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394090 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"bc817ebb-7ab7-41d2-8961-92191d7749e9","Type":"ContainerDied","Data":"d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3"} Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394760 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d460ca6197c6169b8175da1780d793119656b56b9e6b3e7442efa458df8fdfa3" Mar 10 00:32:14 crc kubenswrapper[4994]: I0310 00:32:14.394191 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.920439 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921286 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="git-clone" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921312 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="git-clone" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921339 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921352 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921377 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921389 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 10 00:32:15 crc kubenswrapper[4994]: E0310 00:32:15.921413 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="manage-dockerfile" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921426 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="manage-dockerfile" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921629 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="615394b2-0705-4358-853e-8c52eb448519" containerName="oc" Mar 
10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.921646 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc817ebb-7ab7-41d2-8961-92191d7749e9" containerName="docker-build" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.922381 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.929100 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-nfz6x" Mar 10 00:32:15 crc kubenswrapper[4994]: I0310 00:32:15.957756 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.084207 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.185479 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.208652 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"infrawatch-operators-sf9c4\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc 
kubenswrapper[4994]: I0310 00:32:16.286369 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.388974 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bc817ebb-7ab7-41d2-8961-92191d7749e9" (UID: "bc817ebb-7ab7-41d2-8961-92191d7749e9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.487471 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:16 crc kubenswrapper[4994]: I0310 00:32:16.490239 4994 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bc817ebb-7ab7-41d2-8961-92191d7749e9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:17 crc kubenswrapper[4994]: I0310 00:32:17.416233 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerStarted","Data":"ec33abc3badf44282af2cba3baa8e66b6f8f9dce90c50cf57dd91d9f331c1b26"} Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892197 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892272 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.892330 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.893151 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:32:18 crc kubenswrapper[4994]: I0310 00:32:18.893239 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" containerID="cri-o://d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629" gracePeriod=600 Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.431773 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629" exitCode=0 Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432106 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629"} Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432251 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" 
event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"} Mar 10 00:32:19 crc kubenswrapper[4994]: I0310 00:32:19.432272 4994 scope.go:117] "RemoveContainer" containerID="779b11783e082c837efecec96b026cd3be87293636b7184dfd3efe1ae146c491" Mar 10 00:32:20 crc kubenswrapper[4994]: I0310 00:32:20.704677 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.524070 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.525481 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.525605 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.662763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.764195 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.782700 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zh9hc\" (UniqueName: \"kubernetes.io/projected/ea63ae5e-58aa-4f18-b14b-514e618f4839-kube-api-access-zh9hc\") pod \"infrawatch-operators-hhvb6\" (UID: \"ea63ae5e-58aa-4f18-b14b-514e618f4839\") " pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:21 crc kubenswrapper[4994]: I0310 00:32:21.879428 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:22 crc kubenswrapper[4994]: I0310 00:32:22.225491 4994 scope.go:117] "RemoveContainer" containerID="4533fb5052fb646d2bdd6d148243818a0541d9f71db00a897459732577383e18" Mar 10 00:32:29 crc kubenswrapper[4994]: I0310 00:32:29.182451 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-hhvb6"] Mar 10 00:32:29 crc kubenswrapper[4994]: W0310 00:32:29.278852 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea63ae5e_58aa_4f18_b14b_514e618f4839.slice/crio-58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75 WatchSource:0}: Error finding container 58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75: Status 404 returned error can't find the container with id 58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75 Mar 10 00:32:29 crc kubenswrapper[4994]: I0310 00:32:29.517846 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-hhvb6" event={"ID":"ea63ae5e-58aa-4f18-b14b-514e618f4839","Type":"ContainerStarted","Data":"58e384788fdc7cd4b21d43b780461a6b0945556ce063d139f11eabf1041cba75"} Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.527564 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerStarted","Data":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} Mar 10 00:32:30 
crc kubenswrapper[4994]: I0310 00:32:30.527691 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-sf9c4" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" containerID="cri-o://9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" gracePeriod=2 Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.530029 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-hhvb6" event={"ID":"ea63ae5e-58aa-4f18-b14b-514e618f4839","Type":"ContainerStarted","Data":"d951629f67a54bec89647252d09794985e055265472771262c3e440dc915a918"} Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.547854 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-sf9c4" podStartSLOduration=2.56385 podStartE2EDuration="15.547831781s" podCreationTimestamp="2026-03-10 00:32:15 +0000 UTC" firstStartedPulling="2026-03-10 00:32:16.495663307 +0000 UTC m=+1550.669370056" lastFinishedPulling="2026-03-10 00:32:29.479645058 +0000 UTC m=+1563.653351837" observedRunningTime="2026-03-10 00:32:30.546135657 +0000 UTC m=+1564.719842406" watchObservedRunningTime="2026-03-10 00:32:30.547831781 +0000 UTC m=+1564.721538530" Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.572897 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-hhvb6" podStartSLOduration=9.425579737 podStartE2EDuration="9.572867561s" podCreationTimestamp="2026-03-10 00:32:21 +0000 UTC" firstStartedPulling="2026-03-10 00:32:29.282447939 +0000 UTC m=+1563.456154708" lastFinishedPulling="2026-03-10 00:32:29.429735773 +0000 UTC m=+1563.603442532" observedRunningTime="2026-03-10 00:32:30.56776765 +0000 UTC m=+1564.741474399" watchObservedRunningTime="2026-03-10 00:32:30.572867561 +0000 UTC m=+1564.746574310" Mar 10 00:32:30 crc kubenswrapper[4994]: I0310 00:32:30.943945 
4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.006945 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") pod \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\" (UID: \"01668f0d-50fe-449b-9fc8-2b949a68bb4e\") " Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.013490 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz" (OuterVolumeSpecName: "kube-api-access-8d4jz") pod "01668f0d-50fe-449b-9fc8-2b949a68bb4e" (UID: "01668f0d-50fe-449b-9fc8-2b949a68bb4e"). InnerVolumeSpecName "kube-api-access-8d4jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.108541 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d4jz\" (UniqueName: \"kubernetes.io/projected/01668f0d-50fe-449b-9fc8-2b949a68bb4e-kube-api-access-8d4jz\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.544588 4994 generic.go:334] "Generic (PLEG): container finished" podID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" exitCode=0 Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.546194 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-sf9c4" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547557 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerDied","Data":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547629 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-sf9c4" event={"ID":"01668f0d-50fe-449b-9fc8-2b949a68bb4e","Type":"ContainerDied","Data":"ec33abc3badf44282af2cba3baa8e66b6f8f9dce90c50cf57dd91d9f331c1b26"} Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.547660 4994 scope.go:117] "RemoveContainer" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.612296 4994 scope.go:117] "RemoveContainer" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: E0310 00:32:31.614434 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": container with ID starting with 9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6 not found: ID does not exist" containerID="9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.614493 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6"} err="failed to get container status \"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": rpc error: code = NotFound desc = could not find container 
\"9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6\": container with ID starting with 9dc82142e37c4f2926fd7fd1285bfa0ae6e8288234aac77d7e37871c6cbb29c6 not found: ID does not exist" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.632972 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.643198 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-sf9c4"] Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.880518 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.880967 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:31 crc kubenswrapper[4994]: I0310 00:32:31.923805 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:32 crc kubenswrapper[4994]: I0310 00:32:32.566048 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" path="/var/lib/kubelet/pods/01668f0d-50fe-449b-9fc8-2b949a68bb4e/volumes" Mar 10 00:32:41 crc kubenswrapper[4994]: I0310 00:32:41.934949 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-hhvb6" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.174842 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: E0310 00:32:44.176105 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 
00:32:44.176201 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.176440 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="01668f0d-50fe-449b-9fc8-2b949a68bb4e" containerName="registry-server" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.177536 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.194978 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.298569 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.298649 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.299048 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: 
\"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400710 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400799 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.400979 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.402102 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: 
\"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.402204 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.452543 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.513782 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.848298 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99"] Mar 10 00:32:44 crc kubenswrapper[4994]: W0310 00:32:44.852153 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc127337a_56e6_4642_b020_920e566abbd8.slice/crio-8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3 WatchSource:0}: Error finding container 8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3: Status 404 returned error can't find the container with id 8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3 Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.976814 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"] Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.978902 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:44 crc kubenswrapper[4994]: I0310 00:32:44.992317 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"] Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010547 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010596 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.010642 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111558 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") 
pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111711 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.111774 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.112654 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.112746 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " 
pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.136603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.310749 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706070 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="e78f149f3ee66141e736adb1d0dc9a4c936fa8f53c768cb49326775840a68b0e" exitCode=0 Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706210 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"e78f149f3ee66141e736adb1d0dc9a4c936fa8f53c768cb49326775840a68b0e"} Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.706561 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerStarted","Data":"8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3"} Mar 10 00:32:45 crc kubenswrapper[4994]: I0310 00:32:45.837682 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7"] Mar 10 00:32:45 crc 
kubenswrapper[4994]: W0310 00:32:45.838952 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5571bf_db5f_44e5_90a1_498f2f969ca8.slice/crio-86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46 WatchSource:0}: Error finding container 86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46: Status 404 returned error can't find the container with id 86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46 Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.718304 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="eb83498b13dc98bff97d5a0400a7387ef755246e02a753234d0ab169f8b47795" exitCode=0 Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.718392 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"eb83498b13dc98bff97d5a0400a7387ef755246e02a753234d0ab169f8b47795"} Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723213 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="4d07a4d08afb8e9202cbc788aac3bbf88b51a45d2b4a34125ca3aa3c061a0f90" exitCode=0 Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723264 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"4d07a4d08afb8e9202cbc788aac3bbf88b51a45d2b4a34125ca3aa3c061a0f90"} Mar 10 00:32:46 crc kubenswrapper[4994]: I0310 00:32:46.723303 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" 
event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerStarted","Data":"86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46"} Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.738842 4994 generic.go:334] "Generic (PLEG): container finished" podID="c127337a-56e6-4642-b020-920e566abbd8" containerID="8dd12520e968c32892f2a71390fb9c5d48b8062b13352b7e632f3e5e28318421" exitCode=0 Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.738938 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"8dd12520e968c32892f2a71390fb9c5d48b8062b13352b7e632f3e5e28318421"} Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.742936 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="6ab6fc48ccef71475117dba5680a4efdfd25a0886352f7b62443696caf5ea0ac" exitCode=0 Mar 10 00:32:47 crc kubenswrapper[4994]: I0310 00:32:47.743010 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"6ab6fc48ccef71475117dba5680a4efdfd25a0886352f7b62443696caf5ea0ac"} Mar 10 00:32:48 crc kubenswrapper[4994]: I0310 00:32:48.753678 4994 generic.go:334] "Generic (PLEG): container finished" podID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerID="3f89fc695b0342472abe2241f0c45238e106e6bb5e75eded193907abdb1b81a1" exitCode=0 Mar 10 00:32:48 crc kubenswrapper[4994]: I0310 00:32:48.753779 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"3f89fc695b0342472abe2241f0c45238e106e6bb5e75eded193907abdb1b81a1"} Mar 10 
00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.050077 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.066850 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067084 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067324 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") pod \"c127337a-56e6-4642-b020-920e566abbd8\" (UID: \"c127337a-56e6-4642-b020-920e566abbd8\") " Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.067950 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle" (OuterVolumeSpecName: "bundle") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.079127 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf" (OuterVolumeSpecName: "kube-api-access-q6vjf") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "kube-api-access-q6vjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.091543 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util" (OuterVolumeSpecName: "util") pod "c127337a-56e6-4642-b020-920e566abbd8" (UID: "c127337a-56e6-4642-b020-920e566abbd8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169055 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169100 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c127337a-56e6-4642-b020-920e566abbd8-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.169119 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vjf\" (UniqueName: \"kubernetes.io/projected/c127337a-56e6-4642-b020-920e566abbd8-kube-api-access-q6vjf\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766024 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766013 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09htp99" event={"ID":"c127337a-56e6-4642-b020-920e566abbd8","Type":"ContainerDied","Data":"8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3"} Mar 10 00:32:49 crc kubenswrapper[4994]: I0310 00:32:49.766110 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8baf587def5b5252d7f47162fd8fbc2fc7e54f6a059e943333d7cd9fd8dfe3f3" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.104683 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181821 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181943 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.181998 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") pod \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\" (UID: \"8d5571bf-db5f-44e5-90a1-498f2f969ca8\") " Mar 10 00:32:50 crc kubenswrapper[4994]: 
I0310 00:32:50.183071 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle" (OuterVolumeSpecName: "bundle") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.183480 4994 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.191274 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc" (OuterVolumeSpecName: "kube-api-access-j48cc") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "kube-api-access-j48cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.200519 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util" (OuterVolumeSpecName: "util") pod "8d5571bf-db5f-44e5-90a1-498f2f969ca8" (UID: "8d5571bf-db5f-44e5-90a1-498f2f969ca8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.284855 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48cc\" (UniqueName: \"kubernetes.io/projected/8d5571bf-db5f-44e5-90a1-498f2f969ca8-kube-api-access-j48cc\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.284923 4994 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d5571bf-db5f-44e5-90a1-498f2f969ca8-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.797446 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.798613 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ac94x7" event={"ID":"8d5571bf-db5f-44e5-90a1-498f2f969ca8","Type":"ContainerDied","Data":"86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46"} Mar 10 00:32:50 crc kubenswrapper[4994]: I0310 00:32:50.798649 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e7a6bfb00006ef9c88c38d656ae4dd22b5e7518ad52ea26c2f316565df2f46" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.470931 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"] Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471783 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471798 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: 
E0310 00:32:57.471810 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="pull" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471819 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="pull" Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471830 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="util" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471839 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="util" Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471849 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471857 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471910 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="pull" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471918 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="pull" Mar 10 00:32:57 crc kubenswrapper[4994]: E0310 00:32:57.471931 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="util" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.471939 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="util" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472120 4994 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c127337a-56e6-4642-b020-920e566abbd8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472143 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d5571bf-db5f-44e5-90a1-498f2f969ca8" containerName="extract" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.472685 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.477457 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-rvb57" Mar 10 00:32:57 crc kubenswrapper[4994]: I0310 00:32:57.500320 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"] Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.233960 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.234041 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.336411 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod 
\"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.336516 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.337891 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/134b5ce4-37cf-459f-9c27-dafae8eb9e86-runner\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.366815 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp66c\" (UniqueName: \"kubernetes.io/projected/134b5ce4-37cf-459f-9c27-dafae8eb9e86-kube-api-access-qp66c\") pod \"service-telemetry-operator-656df8f446-7rqn6\" (UID: \"134b5ce4-37cf-459f-9c27-dafae8eb9e86\") " pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.390534 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" Mar 10 00:32:58 crc kubenswrapper[4994]: I0310 00:32:58.893426 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-656df8f446-7rqn6"] Mar 10 00:32:59 crc kubenswrapper[4994]: I0310 00:32:59.281537 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" event={"ID":"134b5ce4-37cf-459f-9c27-dafae8eb9e86","Type":"ContainerStarted","Data":"940bffb00a849370e8dfc93b5d044295c971e9d27843d9549585d5e848cb7a5c"} Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.090448 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"] Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.091491 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.097797 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-svrw7" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.114675 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"] Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.162047 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.162113 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cck8\" (UniqueName: 
\"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263147 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263208 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cck8\" (UniqueName: \"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.263590 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/0730e042-f632-4db2-a694-b5917982d77d-runner\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.289415 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cck8\" (UniqueName: \"kubernetes.io/projected/0730e042-f632-4db2-a694-b5917982d77d-kube-api-access-8cck8\") pod \"smart-gateway-operator-7457956966-kbwlx\" (UID: \"0730e042-f632-4db2-a694-b5917982d77d\") " pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.413357 4994 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" Mar 10 00:33:00 crc kubenswrapper[4994]: I0310 00:33:00.907172 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7457956966-kbwlx"] Mar 10 00:33:01 crc kubenswrapper[4994]: I0310 00:33:01.300482 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" event={"ID":"0730e042-f632-4db2-a694-b5917982d77d","Type":"ContainerStarted","Data":"df25a59175fadab10f6d5b5817cbff5686da84c6460cba031ba25774701f5646"} Mar 10 00:33:19 crc kubenswrapper[4994]: I0310 00:33:19.474172 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" event={"ID":"134b5ce4-37cf-459f-9c27-dafae8eb9e86","Type":"ContainerStarted","Data":"e70afd2fb89cca0d4de8bfa3e1e955a780443e3c84f94c175ab446950c617f0d"} Mar 10 00:33:19 crc kubenswrapper[4994]: I0310 00:33:19.501528 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-656df8f446-7rqn6" podStartSLOduration=2.144236077 podStartE2EDuration="22.501508987s" podCreationTimestamp="2026-03-10 00:32:57 +0000 UTC" firstStartedPulling="2026-03-10 00:32:58.914665802 +0000 UTC m=+1593.088372551" lastFinishedPulling="2026-03-10 00:33:19.271938722 +0000 UTC m=+1613.445645461" observedRunningTime="2026-03-10 00:33:19.494926579 +0000 UTC m=+1613.668633338" watchObservedRunningTime="2026-03-10 00:33:19.501508987 +0000 UTC m=+1613.675215736" Mar 10 00:33:24 crc kubenswrapper[4994]: I0310 00:33:24.525022 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" event={"ID":"0730e042-f632-4db2-a694-b5917982d77d","Type":"ContainerStarted","Data":"c1a08e50b6d31c063393515002dd019ec4d9fcbf36d6f94b647697a1b6cbcb54"} Mar 10 00:33:24 crc kubenswrapper[4994]: I0310 
00:33:24.550590 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-7457956966-kbwlx" podStartSLOduration=1.315657342 podStartE2EDuration="24.550563684s" podCreationTimestamp="2026-03-10 00:33:00 +0000 UTC" firstStartedPulling="2026-03-10 00:33:00.924114189 +0000 UTC m=+1595.097820938" lastFinishedPulling="2026-03-10 00:33:24.159020521 +0000 UTC m=+1618.332727280" observedRunningTime="2026-03-10 00:33:24.541318578 +0000 UTC m=+1618.715025407" watchObservedRunningTime="2026-03-10 00:33:24.550563684 +0000 UTC m=+1618.724270473" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.745828 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.747618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750110 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750319 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-ngmz6" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750483 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.750569 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.752089 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.752511 
4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.757234 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.765516 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852621 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852667 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852723 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852745 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852789 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.852835 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954338 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954410 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954437 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954466 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954500 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " 
pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954526 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.954576 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.961529 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962241 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962373 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: 
\"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.962405 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.968325 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.989413 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:47 crc kubenswrapper[4994]: I0310 00:33:47.993010 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"default-interconnect-68864d46cb-kmjj2\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:48 crc 
kubenswrapper[4994]: I0310 00:33:48.067618 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:33:48 crc kubenswrapper[4994]: I0310 00:33:48.265596 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:33:48 crc kubenswrapper[4994]: W0310 00:33:48.273016 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd538cc5_49ab_4de4_b202_9068ffe969df.slice/crio-a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec WatchSource:0}: Error finding container a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec: Status 404 returned error can't find the container with id a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec Mar 10 00:33:48 crc kubenswrapper[4994]: I0310 00:33:48.735750 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerStarted","Data":"a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec"} Mar 10 00:33:53 crc kubenswrapper[4994]: I0310 00:33:53.780084 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerStarted","Data":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} Mar 10 00:33:53 crc kubenswrapper[4994]: I0310 00:33:53.803898 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" podStartSLOduration=1.7704750150000002 podStartE2EDuration="6.803866967s" podCreationTimestamp="2026-03-10 00:33:47 +0000 UTC" firstStartedPulling="2026-03-10 00:33:48.274469637 +0000 UTC m=+1642.448176386" 
lastFinishedPulling="2026-03-10 00:33:53.307861589 +0000 UTC m=+1647.481568338" observedRunningTime="2026-03-10 00:33:53.798571022 +0000 UTC m=+1647.972277841" watchObservedRunningTime="2026-03-10 00:33:53.803866967 +0000 UTC m=+1647.977573706" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.723588 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.728522 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.730834 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731236 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731488 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-8q24q" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731555 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731723 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.731917 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732323 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732362 4994 reflector.go:368] Caches populated for *v1.Secret from 
object-"service-telemetry"/"prometheus-default" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732413 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.732436 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.855094 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.903532 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.903851 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904006 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904131 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904213 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904293 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904366 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904447 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904520 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904592 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904683 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:57 crc kubenswrapper[4994]: I0310 00:33:57.904754 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005453 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 
10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005524 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005556 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005598 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005624 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005657 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" 
Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005707 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005738 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005781 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005812 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.005841 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" 
Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.006020 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.007354 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: E0310 00:33:58.009007 4994 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 10 00:33:58 crc kubenswrapper[4994]: E0310 00:33:58.009238 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls podName:69942722-a3c1-459b-96d3-260e0813093b nodeName:}" failed. No retries permitted until 2026-03-10 00:33:58.509212845 +0000 UTC m=+1652.682919614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "69942722-a3c1-459b-96d3-260e0813093b") : secret "default-prometheus-proxy-tls" not found Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.009833 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.010741 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.010924 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69942722-a3c1-459b-96d3-260e0813093b-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.014180 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69942722-a3c1-459b-96d3-260e0813093b-config-out\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.014423 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.015565 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.015723 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3bf73a2b0d38f9974f6fad8c70186349ac24419a9badade344c26d83828013f9/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019536 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019791 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-web-config\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.019937 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-tls-assets\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.052355 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l68ww\" (UniqueName: \"kubernetes.io/projected/69942722-a3c1-459b-96d3-260e0813093b-kube-api-access-l68ww\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.058099 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-749eb71b-ca9c-4295-a898-0f4a4ece462e\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.512292 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.516436 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/69942722-a3c1-459b-96d3-260e0813093b-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"69942722-a3c1-459b-96d3-260e0813093b\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.659391 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:58 crc kubenswrapper[4994]: I0310 00:33:58.876797 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:59 crc kubenswrapper[4994]: I0310 00:33:59.847093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"f98cbb4276c3f97fb9ef091fa2e193efa3a690a6754100283e460849e17ceb76"} Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.122306 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.127904 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.129741 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.130209 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.131289 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.135260 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.234935 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " 
pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.336815 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.359902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"auto-csr-approver-29551714-s79ft\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:00 crc kubenswrapper[4994]: I0310 00:34:00.450821 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:01 crc kubenswrapper[4994]: I0310 00:34:01.252449 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"] Mar 10 00:34:01 crc kubenswrapper[4994]: I0310 00:34:01.863366 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerStarted","Data":"844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21"} Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.884856 4994 generic.go:334] "Generic (PLEG): container finished" podID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerID="55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10" exitCode=0 Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.884973 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerDied","Data":"55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10"} Mar 10 00:34:03 crc kubenswrapper[4994]: I0310 00:34:03.888398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23"} Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.270715 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.409542 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") pod \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\" (UID: \"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c\") " Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.421016 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4" (OuterVolumeSpecName: "kube-api-access-549f4") pod "3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" (UID: "3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c"). InnerVolumeSpecName "kube-api-access-549f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.511768 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549f4\" (UniqueName: \"kubernetes.io/projected/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c-kube-api-access-549f4\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909417 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-s79ft" event={"ID":"3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c","Type":"ContainerDied","Data":"844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21"} Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909481 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844f968fc65a43a5fe0017d514fe8a3ca83ad89726d636eec777ca0d9738de21" Mar 10 00:34:05 crc kubenswrapper[4994]: I0310 00:34:05.909496 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-s79ft" Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.347688 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.357437 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-mcgcl"] Mar 10 00:34:06 crc kubenswrapper[4994]: I0310 00:34:06.566718 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b6ae72-9c1a-4191-84af-d06b0155e244" path="/var/lib/kubelet/pods/79b6ae72-9c1a-4191-84af-d06b0155e244/volumes" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.494953 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:07 crc kubenswrapper[4994]: E0310 00:34:07.495239 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495255 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495385 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" containerName="oc" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.495797 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.555804 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.640290 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod \"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.741349 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod \"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.761674 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njjn\" (UniqueName: \"kubernetes.io/projected/8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b-kube-api-access-5njjn\") pod 
\"default-snmp-webhook-6856cfb745-fzqv5\" (UID: \"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:07 crc kubenswrapper[4994]: I0310 00:34:07.861486 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" Mar 10 00:34:08 crc kubenswrapper[4994]: I0310 00:34:08.110321 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-fzqv5"] Mar 10 00:34:08 crc kubenswrapper[4994]: W0310 00:34:08.126756 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6cb6c2_b4dc_41ad_83dc_63de94ec3b6b.slice/crio-266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3 WatchSource:0}: Error finding container 266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3: Status 404 returned error can't find the container with id 266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3 Mar 10 00:34:08 crc kubenswrapper[4994]: I0310 00:34:08.931118 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" event={"ID":"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b","Type":"ContainerStarted","Data":"266f1d0d272a751363a358ab288f8d8806ffcd76fb0de433b496be8b9125c9b3"} Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.874911 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.876718 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882588 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882713 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882750 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-djsrm" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882782 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.882788 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.886171 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.892540 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.945088 4994 generic.go:334] "Generic (PLEG): container finished" podID="69942722-a3c1-459b-96d3-260e0813093b" containerID="3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23" exitCode=0 Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.945125 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerDied","Data":"3573b3eabc534c0ae6632ceb8b4c59c0de6af63257f8b399b8492eafdae4de23"} Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996623 4994 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996692 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996712 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996771 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " 
pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996793 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996811 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996884 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:10 crc kubenswrapper[4994]: I0310 00:34:10.996915 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098646 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: 
\"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098709 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098739 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098769 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098816 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098847 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") 
pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098886 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098940 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.098970 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.099045 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.099148 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:11.599127054 +0000 UTC m=+1665.772833803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.104762 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd991a1f-d471-40c4-919f-75400e047b5d-config-out\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105042 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-tls-assets\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105181 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-web-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.105414 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.108446 4994 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.108535 4994 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/70776db84b8166f32938d777b455eaeb195500fe20856cd31d03188fb0ad0492/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.115108 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-config-volume\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.115970 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.123943 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bhsr\" (UniqueName: \"kubernetes.io/projected/bd991a1f-d471-40c4-919f-75400e047b5d-kube-api-access-9bhsr\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.134184 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-367f0112-d331-46f6-9afc-7958b67f370c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-367f0112-d331-46f6-9afc-7958b67f370c\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: I0310 00:34:11.605097 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.606315 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:11 crc kubenswrapper[4994]: E0310 00:34:11.606364 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:12.606349589 +0000 UTC m=+1666.780056338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:12 crc kubenswrapper[4994]: I0310 00:34:12.619269 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:12 crc kubenswrapper[4994]: E0310 00:34:12.619672 4994 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:12 crc kubenswrapper[4994]: E0310 00:34:12.619780 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls podName:bd991a1f-d471-40c4-919f-75400e047b5d nodeName:}" failed. No retries permitted until 2026-03-10 00:34:14.619750203 +0000 UTC m=+1668.793456982 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "bd991a1f-d471-40c4-919f-75400e047b5d") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.648499 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.655983 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd991a1f-d471-40c4-919f-75400e047b5d-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"bd991a1f-d471-40c4-919f-75400e047b5d\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:14 crc kubenswrapper[4994]: I0310 00:34:14.884988 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:34:15 crc kubenswrapper[4994]: I0310 00:34:15.866904 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:34:15 crc kubenswrapper[4994]: I0310 00:34:15.998102 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"2ae45c0efca905bcdb1a1dbe21cf60c0378b7a9c5b331c7bee9fbabfa6bc01ce"} Mar 10 00:34:17 crc kubenswrapper[4994]: I0310 00:34:17.006072 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" event={"ID":"8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b","Type":"ContainerStarted","Data":"c5143e3e63c3da41e9465f29e6534abe1fb3747e2848cdb4618517be3fbdac84"} Mar 10 00:34:17 crc kubenswrapper[4994]: I0310 00:34:17.029704 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-fzqv5" podStartSLOduration=2.350913972 podStartE2EDuration="10.029683746s" podCreationTimestamp="2026-03-10 00:34:07 +0000 UTC" firstStartedPulling="2026-03-10 00:34:08.128848409 +0000 UTC m=+1662.302555158" lastFinishedPulling="2026-03-10 00:34:15.807618183 +0000 UTC m=+1669.981324932" observedRunningTime="2026-03-10 00:34:17.027194572 +0000 UTC m=+1671.200901341" watchObservedRunningTime="2026-03-10 00:34:17.029683746 +0000 UTC m=+1671.203390495" Mar 10 00:34:18 crc kubenswrapper[4994]: I0310 00:34:18.016277 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de"} Mar 10 00:34:21 crc kubenswrapper[4994]: I0310 00:34:21.038881 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"4c8dcbd81dbc3165ff83337d1eee59d8f9fba78c944471291cc50be2ea945974"} Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.949800 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.960491 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.962593 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-xrmnc" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.963558 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.963731 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.964009 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 10 00:34:22 crc kubenswrapper[4994]: I0310 00:34:22.993207 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.058533 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"7cfe95d3958f7e00f3e16968e8dbc37f4b37a628b58fbc719b73147dc3fa4d39"} Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080301 4994 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080522 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080693 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080827 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.080882 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182704 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182757 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182786 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182857 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: 
\"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.182929 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.183075 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.183150 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls podName:ba63838d-f012-4322-afa9-d46cb2387ae8 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:23.683129858 +0000 UTC m=+1677.856836617 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" (UID: "ba63838d-f012-4322-afa9-d46cb2387ae8") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.183405 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/ba63838d-f012-4322-afa9-d46cb2387ae8-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.183902 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/ba63838d-f012-4322-afa9-d46cb2387ae8-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.198087 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.199660 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnn6\" (UniqueName: \"kubernetes.io/projected/ba63838d-f012-4322-afa9-d46cb2387ae8-kube-api-access-6lnn6\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: I0310 00:34:23.688265 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.688402 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:23 crc kubenswrapper[4994]: E0310 00:34:23.688450 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls podName:ba63838d-f012-4322-afa9-d46cb2387ae8 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:24.688435374 +0000 UTC m=+1678.862142123 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" (UID: "ba63838d-f012-4322-afa9-d46cb2387ae8") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.701534 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.705944 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba63838d-f012-4322-afa9-d46cb2387ae8-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl\" (UID: \"ba63838d-f012-4322-afa9-d46cb2387ae8\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:24 crc kubenswrapper[4994]: I0310 00:34:24.788331 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.014424 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.017055 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.020249 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.020506 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.031277 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.083520 4994 generic.go:334] "Generic (PLEG): container finished" podID="bd991a1f-d471-40c4-919f-75400e047b5d" containerID="fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de" exitCode=0 Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.083558 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerDied","Data":"fd3d6b6bb77479ee7e5d2a48a2c1037c6305df2cbfacc9b60b82d0f19e9117de"} Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120523 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120633 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120680 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120724 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.120744 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221689 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221835 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221890 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221934 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.221968 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" 
Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.222594 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.222655 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls podName:71658d15-ee94-436a-8266-e6ef3680d0f0 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:26.722636491 +0000 UTC m=+1680.896343350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" (UID: "71658d15-ee94-436a-8266-e6ef3680d0f0") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.223087 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71658d15-ee94-436a-8266-e6ef3680d0f0-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.223139 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71658d15-ee94-436a-8266-e6ef3680d0f0-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.242547 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"session-secret\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.249653 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6rvl\" (UniqueName: \"kubernetes.io/projected/71658d15-ee94-436a-8266-e6ef3680d0f0-kube-api-access-s6rvl\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: I0310 00:34:26.729525 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.729674 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:26 crc kubenswrapper[4994]: E0310 00:34:26.729934 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls podName:71658d15-ee94-436a-8266-e6ef3680d0f0 nodeName:}" failed. No retries permitted until 2026-03-10 00:34:27.729917409 +0000 UTC m=+1681.903624158 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" (UID: "71658d15-ee94-436a-8266-e6ef3680d0f0") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.745772 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.762781 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71658d15-ee94-436a-8266-e6ef3680d0f0-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn\" (UID: \"71658d15-ee94-436a-8266-e6ef3680d0f0\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:27 crc kubenswrapper[4994]: I0310 00:34:27.833588 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.009105 4994 scope.go:117] "RemoveContainer" containerID="76e4652e3cbbbc7dd950c967558f6551927fe404fc62cca88c145579e5829da9" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.234364 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.237029 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.239905 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.240309 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.243389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269261 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269339 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269373 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269512 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.269566 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372810 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372885 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372919 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372953 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.372979 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.373108 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.373160 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls podName:eb6db775-6577-4ad6-90db-1fffd09b924b nodeName:}" failed. No retries permitted until 2026-03-10 00:34:29.873142247 +0000 UTC m=+1684.046848996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" (UID: "eb6db775-6577-4ad6-90db-1fffd09b924b") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.377476 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.377739 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb6db775-6577-4ad6-90db-1fffd09b924b-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.378494 4994 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/eb6db775-6577-4ad6-90db-1fffd09b924b-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.393506 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzss5\" (UniqueName: \"kubernetes.io/projected/eb6db775-6577-4ad6-90db-1fffd09b924b-kube-api-access-lzss5\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: I0310 00:34:29.878127 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.878378 4994 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:29 crc kubenswrapper[4994]: E0310 00:34:29.878486 4994 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls podName:eb6db775-6577-4ad6-90db-1fffd09b924b nodeName:}" failed. No retries permitted until 2026-03-10 00:34:30.878456263 +0000 UTC m=+1685.052163042 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" (UID: "eb6db775-6577-4ad6-90db-1fffd09b924b") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:34:30 crc kubenswrapper[4994]: I0310 00:34:30.897722 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:30 crc kubenswrapper[4994]: I0310 00:34:30.914358 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb6db775-6577-4ad6-90db-1fffd09b924b-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl\" (UID: \"eb6db775-6577-4ad6-90db-1fffd09b924b\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:31 crc kubenswrapper[4994]: I0310 00:34:31.056015 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.061420 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl"] Mar 10 00:34:33 crc kubenswrapper[4994]: W0310 00:34:33.069255 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb6db775_6577_4ad6_90db_1fffd09b924b.slice/crio-25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d WatchSource:0}: Error finding container 25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d: Status 404 returned error can't find the container with id 25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.113203 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl"] Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.125253 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn"] Mar 10 00:34:33 crc kubenswrapper[4994]: W0310 00:34:33.140841 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71658d15_ee94_436a_8266_e6ef3680d0f0.slice/crio-4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6 WatchSource:0}: Error finding container 4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6: Status 404 returned error can't find the container with id 4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6 Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.149518 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"69942722-a3c1-459b-96d3-260e0813093b","Type":"ContainerStarted","Data":"cc76e7036a4a1e7a0ec7bfa5f38f0377cd47c300fdae995408476050e36061f4"} Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.152627 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"25642a0d6375eeeb023db7e18c0e5dcfe4b7d2646edd565df60dde9ae75c570d"} Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.172357 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=3.37955234 podStartE2EDuration="37.172334315s" podCreationTimestamp="2026-03-10 00:33:56 +0000 UTC" firstStartedPulling="2026-03-10 00:33:58.892226944 +0000 UTC m=+1653.065933693" lastFinishedPulling="2026-03-10 00:34:32.685008919 +0000 UTC m=+1686.858715668" observedRunningTime="2026-03-10 00:34:33.168156538 +0000 UTC m=+1687.341863307" watchObservedRunningTime="2026-03-10 00:34:33.172334315 +0000 UTC m=+1687.346041084" Mar 10 00:34:33 crc kubenswrapper[4994]: I0310 00:34:33.660120 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:34 crc kubenswrapper[4994]: I0310 00:34:34.159805 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"b5ec25fcda84df8c51f85ac77064fcfb723d1c6b168a72edeed43f24a89f1f4f"} Mar 10 00:34:34 crc kubenswrapper[4994]: I0310 00:34:34.161404 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" 
event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"4970eb399bb870387b2d70df83ed093674b56d4d7f7d107ec1d6c7d1465cf6d6"} Mar 10 00:34:37 crc kubenswrapper[4994]: I0310 00:34:37.225698 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"2ffe9e746d1f5e3dc40721ca6b65cac0335f05a6b67509907e9c69f144f050aa"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.239769 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"53f62f7c35abd11f85b8e6021296d4d751874456f6783b02d540640a361fb444"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.240153 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.241948 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"f03960de7e0eee16beff7087a2835cba9b98d37f674a392e4a72e7b02877ae4c"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.244430 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"a013a4322541af7a724ffa0f8a8b9913fc8612cfdff1a246b94f3311c0720c6f"} Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.406729 4994 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.408573 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.412292 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.412520 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.419722 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520201 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520280 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520318 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.520341 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623146 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623205 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623824 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/945b9546-8486-448b-a7b8-ec76634ff030-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.623924 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.624709 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/945b9546-8486-448b-a7b8-ec76634ff030-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.624855 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.750371 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/945b9546-8486-448b-a7b8-ec76634ff030-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:38 crc kubenswrapper[4994]: I0310 00:34:38.750380 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ws2sp\" (UniqueName: \"kubernetes.io/projected/945b9546-8486-448b-a7b8-ec76634ff030-kube-api-access-ws2sp\") pod \"default-cloud1-coll-event-smartgateway-b85695595-kd89k\" (UID: \"945b9546-8486-448b-a7b8-ec76634ff030\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.029087 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.270320 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.273845 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.293327 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"082a50713611f09a394395ce180245b195a8de5faa73b5061d20daaf3a5a4410"} Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.450468 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k"] Mar 10 00:34:39 crc kubenswrapper[4994]: W0310 00:34:39.484522 4994 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945b9546_8486_448b_a7b8_ec76634ff030.slice/crio-c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662 WatchSource:0}: Error finding container c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662: Status 404 returned error can't find the container with id c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662 Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.624833 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.625792 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.630772 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.640345 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741628 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741687 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod 
\"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741713 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.741773 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842538 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842602 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842634 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.842662 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.843162 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c50e95d0-ef8f-4355-ba85-156de14a4408-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.843618 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c50e95d0-ef8f-4355-ba85-156de14a4408-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.852555 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c50e95d0-ef8f-4355-ba85-156de14a4408-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.858144 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chb9t\" (UniqueName: \"kubernetes.io/projected/c50e95d0-ef8f-4355-ba85-156de14a4408-kube-api-access-chb9t\") pod \"default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6\" (UID: \"c50e95d0-ef8f-4355-ba85-156de14a4408\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:39 crc kubenswrapper[4994]: I0310 00:34:39.964354 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.303094 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6"} Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.303373 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"c5eb670c43ca30ca91bc7d92b6a0fa1a3f43f65a23770ef94e5f5e0b400cf662"} Mar 10 00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.308288 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"bd991a1f-d471-40c4-919f-75400e047b5d","Type":"ContainerStarted","Data":"03523a8a2f91cfcd77bb522b38e12cb89e802728e9024e8ff4d7271b4af233b2"} Mar 10 
00:34:40 crc kubenswrapper[4994]: I0310 00:34:40.329241 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=18.009505292 podStartE2EDuration="31.329172167s" podCreationTimestamp="2026-03-10 00:34:09 +0000 UTC" firstStartedPulling="2026-03-10 00:34:26.086944871 +0000 UTC m=+1680.260651620" lastFinishedPulling="2026-03-10 00:34:39.406611746 +0000 UTC m=+1693.580318495" observedRunningTime="2026-03-10 00:34:40.325155954 +0000 UTC m=+1694.498862703" watchObservedRunningTime="2026-03-10 00:34:40.329172167 +0000 UTC m=+1694.502878916" Mar 10 00:34:43 crc kubenswrapper[4994]: I0310 00:34:43.660184 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:43 crc kubenswrapper[4994]: I0310 00:34:43.712865 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:44 crc kubenswrapper[4994]: I0310 00:34:44.423601 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 10 00:34:48 crc kubenswrapper[4994]: I0310 00:34:48.892544 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:34:48 crc kubenswrapper[4994]: I0310 00:34:48.898223 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:34:49 crc kubenswrapper[4994]: W0310 00:34:49.792350 4994 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc50e95d0_ef8f_4355_ba85_156de14a4408.slice/crio-1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683 WatchSource:0}: Error finding container 1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683: Status 404 returned error can't find the container with id 1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683 Mar 10 00:34:49 crc kubenswrapper[4994]: I0310 00:34:49.792389 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6"] Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.415200 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"fe370f35fd4a308131e3c1f4033bcafe043e97cb9c38019ee3088a1c73df5b11"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.417705 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"013632f070d3c54dbd8158d7b24414c731cd77d54e78eb1de5beab4bf18b85d1"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.420093 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"390f403d95c0494514ef6579e3128b59e525fce1f8ee3b60486bcfe8ee30637b"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.422625 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" 
event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"3c67a501c42f43ed2776494dabfbacee8dfff7375a998da6a3e29927d0ddb447"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424581 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"5737cbac03bf4951a511b42d7dfd30e6838483f53c97da7a1ed71db32c21afa1"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424617 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.424651 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"1a403c668e4a687bbc082011ed98891ea63192f1942f116c4f98c07a98b9f683"} Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.436715 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" podStartSLOduration=12.138287996 podStartE2EDuration="28.436692842s" podCreationTimestamp="2026-03-10 00:34:22 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.139929476 +0000 UTC m=+1687.313636225" lastFinishedPulling="2026-03-10 00:34:49.438334322 +0000 UTC m=+1703.612041071" observedRunningTime="2026-03-10 00:34:50.431378036 +0000 UTC m=+1704.605084805" watchObservedRunningTime="2026-03-10 00:34:50.436692842 +0000 UTC m=+1704.610399601" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.478680 4994 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" podStartSLOduration=11.136160743 podStartE2EDuration="11.478658675s" podCreationTimestamp="2026-03-10 00:34:39 +0000 UTC" firstStartedPulling="2026-03-10 00:34:49.795002106 +0000 UTC m=+1703.968708855" lastFinishedPulling="2026-03-10 00:34:50.137500038 +0000 UTC m=+1704.311206787" observedRunningTime="2026-03-10 00:34:50.477360971 +0000 UTC m=+1704.651067720" watchObservedRunningTime="2026-03-10 00:34:50.478658675 +0000 UTC m=+1704.652365424" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.482967 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" podStartSLOduration=2.544697501 podStartE2EDuration="12.482959655s" podCreationTimestamp="2026-03-10 00:34:38 +0000 UTC" firstStartedPulling="2026-03-10 00:34:39.500041507 +0000 UTC m=+1693.673748266" lastFinishedPulling="2026-03-10 00:34:49.438303661 +0000 UTC m=+1703.612010420" observedRunningTime="2026-03-10 00:34:50.458710395 +0000 UTC m=+1704.632417154" watchObservedRunningTime="2026-03-10 00:34:50.482959655 +0000 UTC m=+1704.656666404" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.501016 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" podStartSLOduration=9.169570077 podStartE2EDuration="25.500995397s" podCreationTimestamp="2026-03-10 00:34:25 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.148566767 +0000 UTC m=+1687.322273516" lastFinishedPulling="2026-03-10 00:34:49.479992087 +0000 UTC m=+1703.653698836" observedRunningTime="2026-03-10 00:34:50.495504676 +0000 UTC m=+1704.669211425" watchObservedRunningTime="2026-03-10 00:34:50.500995397 +0000 UTC m=+1704.674702136" Mar 10 00:34:50 crc kubenswrapper[4994]: I0310 00:34:50.525672 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" podStartSLOduration=5.196874735 podStartE2EDuration="21.525655228s" podCreationTimestamp="2026-03-10 00:34:29 +0000 UTC" firstStartedPulling="2026-03-10 00:34:33.072396758 +0000 UTC m=+1687.246103507" lastFinishedPulling="2026-03-10 00:34:49.401177251 +0000 UTC m=+1703.574884000" observedRunningTime="2026-03-10 00:34:50.525523633 +0000 UTC m=+1704.699230422" watchObservedRunningTime="2026-03-10 00:34:50.525655228 +0000 UTC m=+1704.699361977" Mar 10 00:34:56 crc kubenswrapper[4994]: I0310 00:34:56.240159 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:56 crc kubenswrapper[4994]: I0310 00:34:56.240730 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" containerID="cri-o://b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" gracePeriod=30 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.117762 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219768 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219897 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219919 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219942 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.219969 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.219993 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.220731 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") pod \"bd538cc5-49ab-4de4-b202-9068ffe969df\" (UID: \"bd538cc5-49ab-4de4-b202-9068ffe969df\") " Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.220986 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.225284 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-inter-router-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.226009 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242483 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242587 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.242640 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.243074 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr" (OuterVolumeSpecName: "kube-api-access-5rdzr") pod "bd538cc5-49ab-4de4-b202-9068ffe969df" (UID: "bd538cc5-49ab-4de4-b202-9068ffe969df"). InnerVolumeSpecName "kube-api-access-5rdzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322729 4994 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322767 4994 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322778 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322791 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322803 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.322812 4994 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/bd538cc5-49ab-4de4-b202-9068ffe969df-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.322824 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rdzr\" (UniqueName: \"kubernetes.io/projected/bd538cc5-49ab-4de4-b202-9068ffe969df-kube-api-access-5rdzr\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347041 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:57 crc kubenswrapper[4994]: E0310 00:34:57.347468 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347484 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.347593 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerName="default-interconnect" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.348009 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.364149 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424660 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424731 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424755 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424792 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: 
\"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.424832 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.425101 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.425233 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.468846 4994 generic.go:334] "Generic (PLEG): container finished" podID="c50e95d0-ef8f-4355-ba85-156de14a4408" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.468902 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" 
event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerDied","Data":"9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.469427 4994 scope.go:117] "RemoveContainer" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473378 4994 generic.go:334] "Generic (PLEG): container finished" podID="bd538cc5-49ab-4de4-b202-9068ffe969df" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473442 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473446 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerDied","Data":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-kmjj2" event={"ID":"bd538cc5-49ab-4de4-b202-9068ffe969df","Type":"ContainerDied","Data":"a70d05716712507b24f298fa27a033f55a9aa604b32f2cbef81d39d3a28695ec"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.473625 4994 scope.go:117] "RemoveContainer" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476361 4994 generic.go:334] "Generic (PLEG): container finished" podID="ba63838d-f012-4322-afa9-d46cb2387ae8" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476420 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerDied","Data":"e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.476930 4994 scope.go:117] "RemoveContainer" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.493527 4994 generic.go:334] "Generic (PLEG): container finished" podID="71658d15-ee94-436a-8266-e6ef3680d0f0" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.493582 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerDied","Data":"4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495071 4994 scope.go:117] "RemoveContainer" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495127 4994 generic.go:334] "Generic (PLEG): container finished" podID="945b9546-8486-448b-a7b8-ec76634ff030" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495175 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerDied","Data":"c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.495604 4994 scope.go:117] "RemoveContainer" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" Mar 10 00:34:57 crc 
kubenswrapper[4994]: I0310 00:34:57.505251 4994 scope.go:117] "RemoveContainer" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509421 4994 generic.go:334] "Generic (PLEG): container finished" podID="eb6db775-6577-4ad6-90db-1fffd09b924b" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" exitCode=0 Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509450 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerDied","Data":"8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab"} Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.509780 4994 scope.go:117] "RemoveContainer" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" Mar 10 00:34:57 crc kubenswrapper[4994]: E0310 00:34:57.509982 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": container with ID starting with b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186 not found: ID does not exist" containerID="b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.510027 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186"} err="failed to get container status \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": rpc error: code = NotFound desc = could not find container \"b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186\": container with ID starting with b84bdc1df8e3ab61716696a185380fe1e0fb5570c1fd81669220638acd7a8186 not found: ID does not exist" 
Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.524730 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527038 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527120 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527140 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.527154 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.528049 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.530650 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-kmjj2"] Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531102 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-config\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531371 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.531761 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.536519 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.543659 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-sasl-users\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.548573 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.549030 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.561054 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: 
\"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.563813 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4k9\" (UniqueName: \"kubernetes.io/projected/732a0e77-64b0-4fd5-a2b8-05cb2f80de80-kube-api-access-7p4k9\") pod \"default-interconnect-68864d46cb-2p86s\" (UID: \"732a0e77-64b0-4fd5-a2b8-05cb2f80de80\") " pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.669498 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" Mar 10 00:34:57 crc kubenswrapper[4994]: I0310 00:34:57.964465 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2p86s"] Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.521565 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.529411 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.537428 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90"} Mar 10 00:34:58 crc kubenswrapper[4994]: 
I0310 00:34:58.543566 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" event={"ID":"732a0e77-64b0-4fd5-a2b8-05cb2f80de80","Type":"ContainerStarted","Data":"de2b49af68890bf8c0ee072ca9bdd53673b16a063f8cf78488b2543c00a5e5ed"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.543598 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" event={"ID":"732a0e77-64b0-4fd5-a2b8-05cb2f80de80","Type":"ContainerStarted","Data":"426ab4b74ec294b316f67ef3bb8f570755b1df95ac09ff80248990a8b46ed3ea"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.548578 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.552021 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576"} Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.564513 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd538cc5-49ab-4de4-b202-9068ffe969df" path="/var/lib/kubelet/pods/bd538cc5-49ab-4de4-b202-9068ffe969df/volumes" Mar 10 00:34:58 crc kubenswrapper[4994]: I0310 00:34:58.612672 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-2p86s" podStartSLOduration=2.612644263 podStartE2EDuration="2.612644263s" podCreationTimestamp="2026-03-10 00:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 00:34:58.591479052 +0000 UTC m=+1712.765185811" watchObservedRunningTime="2026-03-10 00:34:58.612644263 +0000 UTC m=+1712.786351022" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.570656 4994 generic.go:334] "Generic (PLEG): container finished" podID="71658d15-ee94-436a-8266-e6ef3680d0f0" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571055 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerDied","Data":"fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571100 4994 scope.go:117] "RemoveContainer" containerID="4d5e4ee3ba96e2dd90d73b0be0ac50c0ec3a639d8e271a43ada0f3e3ebd65d64" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.571693 4994 scope.go:117] "RemoveContainer" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.571959 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_service-telemetry(71658d15-ee94-436a-8266-e6ef3680d0f0)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" podUID="71658d15-ee94-436a-8266-e6ef3680d0f0" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577208 4994 generic.go:334] "Generic (PLEG): container finished" podID="945b9546-8486-448b-a7b8-ec76634ff030" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577281 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerDied","Data":"aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.577822 4994 scope.go:117] "RemoveContainer" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.578076 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-b85695595-kd89k_service-telemetry(945b9546-8486-448b-a7b8-ec76634ff030)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" podUID="945b9546-8486-448b-a7b8-ec76634ff030" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582308 4994 generic.go:334] "Generic (PLEG): container finished" podID="eb6db775-6577-4ad6-90db-1fffd09b924b" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582363 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerDied","Data":"f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.582791 4994 scope.go:117] "RemoveContainer" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.583151 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge 
pod=default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_service-telemetry(eb6db775-6577-4ad6-90db-1fffd09b924b)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" podUID="eb6db775-6577-4ad6-90db-1fffd09b924b" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603093 4994 generic.go:334] "Generic (PLEG): container finished" podID="c50e95d0-ef8f-4355-ba85-156de14a4408" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603168 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerDied","Data":"6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.603970 4994 scope.go:117] "RemoveContainer" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.604288 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_service-telemetry(c50e95d0-ef8f-4355-ba85-156de14a4408)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" podUID="c50e95d0-ef8f-4355-ba85-156de14a4408" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.625423 4994 generic.go:334] "Generic (PLEG): container finished" podID="ba63838d-f012-4322-afa9-d46cb2387ae8" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" exitCode=0 Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.626101 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" 
event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerDied","Data":"169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670"} Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.626351 4994 scope.go:117] "RemoveContainer" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" Mar 10 00:34:59 crc kubenswrapper[4994]: E0310 00:34:59.626502 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_service-telemetry(ba63838d-f012-4322-afa9-d46cb2387ae8)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" podUID="ba63838d-f012-4322-afa9-d46cb2387ae8" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.631353 4994 scope.go:117] "RemoveContainer" containerID="c84531a4e93d62654ebb196f19936778850438953ec8673d6fafbb0edd8785a6" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.684464 4994 scope.go:117] "RemoveContainer" containerID="8b70a93015d9609790745f53b72f74b1cd43d650efee4dc67311f62ad75287ab" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.736900 4994 scope.go:117] "RemoveContainer" containerID="9acae11a5f98bb0e9dd9df4d9e9005d260dc85f7f3616aa89360f47e34d6bf50" Mar 10 00:34:59 crc kubenswrapper[4994]: I0310 00:34:59.766466 4994 scope.go:117] "RemoveContainer" containerID="e6068625af9c2045ef46c0df0c506de4bbc03d0c2d17fa5809ae88cbfbacc695" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.343520 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.344620 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.347113 4994 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.348172 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.350549 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387552 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlst6\" (UniqueName: \"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387604 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.387722 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489330 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlst6\" (UniqueName: 
\"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489386 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.489464 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.490213 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/f445e0d9-75b9-47ec-b98b-9e881b7f1856-qdr-test-config\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.494866 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/f445e0d9-75b9-47ec-b98b-9e881b7f1856-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.504449 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlst6\" (UniqueName: \"kubernetes.io/projected/f445e0d9-75b9-47ec-b98b-9e881b7f1856-kube-api-access-vlst6\") pod \"qdr-test\" (UID: 
\"f445e0d9-75b9-47ec-b98b-9e881b7f1856\") " pod="service-telemetry/qdr-test" Mar 10 00:35:00 crc kubenswrapper[4994]: I0310 00:35:00.657116 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:35:01 crc kubenswrapper[4994]: I0310 00:35:01.132566 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:35:01 crc kubenswrapper[4994]: I0310 00:35:01.656398 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"f445e0d9-75b9-47ec-b98b-9e881b7f1856","Type":"ContainerStarted","Data":"b419da21cbd4c826d10d500abdec14efec0593b7527ab9954c764986b1cdd160"} Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.188622 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.190570 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.217621 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312854 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312947 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " 
pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.312971 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.413994 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414314 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414336 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414603 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " 
pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.414834 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.434055 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"redhat-operators-rzlg8\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:09 crc kubenswrapper[4994]: I0310 00:35:09.512730 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:10 crc kubenswrapper[4994]: I0310 00:35:10.554507 4994 scope.go:117] "RemoveContainer" containerID="aa86ab0a88d44c9621b517e017ee873893234cfa280d568b1f4abc59862b5cb2" Mar 10 00:35:11 crc kubenswrapper[4994]: I0310 00:35:11.554369 4994 scope.go:117] "RemoveContainer" containerID="f7d20e00b2e0ce7b14482f30354361fd2e1ca73ef829a0328bc5831de70cc576" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.136184 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:35:12 crc kubenswrapper[4994]: W0310 00:35:12.154109 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aff907a_203f_42b1_9ecb_20ab1860a00d.slice/crio-067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb WatchSource:0}: Error finding container 067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb: Status 404 returned error can't find the 
container with id 067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.553685 4994 scope.go:117] "RemoveContainer" containerID="fdf59e41141258a1b8472c2d3611bcc044fc9d05558c10266949e1896f729f90" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.554255 4994 scope.go:117] "RemoveContainer" containerID="6d58b9675a43308db681361c0bef9b74160a0c0119ed01f9c003d6f031166ed0" Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.744905 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-b85695595-kd89k" event={"ID":"945b9546-8486-448b-a7b8-ec76634ff030","Type":"ContainerStarted","Data":"399699caec1e522bdeb077a5d07c27a9eea106dc4d93a5ba9e97d0bbd5978ba1"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.751910 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl" event={"ID":"eb6db775-6577-4ad6-90db-1fffd09b924b","Type":"ContainerStarted","Data":"326aebfc04a1e52598b39914c5e7f78c2742d4b50b3bee1bf930ce58eadb9a8b"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753526 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548" exitCode=0 Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753608 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.753631 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" 
event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.758004 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"f445e0d9-75b9-47ec-b98b-9e881b7f1856","Type":"ContainerStarted","Data":"5884f0e424445fcf01622c118482f0c4fc472cf12808b855b40b5afcbe8d1cb3"} Mar 10 00:35:12 crc kubenswrapper[4994]: I0310 00:35:12.816842 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.170389696 podStartE2EDuration="12.816805365s" podCreationTimestamp="2026-03-10 00:35:00 +0000 UTC" firstStartedPulling="2026-03-10 00:35:01.140942311 +0000 UTC m=+1715.314649060" lastFinishedPulling="2026-03-10 00:35:11.78735797 +0000 UTC m=+1725.961064729" observedRunningTime="2026-03-10 00:35:12.81661551 +0000 UTC m=+1726.990322269" watchObservedRunningTime="2026-03-10 00:35:12.816805365 +0000 UTC m=+1726.990512104" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.110115 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.111331 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114436 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114556 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114683 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114741 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114799 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.114923 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.132665 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200339 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200442 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200490 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200556 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200657 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200763 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.200863 
4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.302758 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303369 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303391 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303412 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc 
kubenswrapper[4994]: I0310 00:35:13.303438 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303494 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.303533 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.304828 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.305565 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " 
pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.306240 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.306832 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.307985 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.308616 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.361543 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"stf-smoketest-smoke1-j8jlg\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " 
pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.411506 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.412908 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.419076 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.428153 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.507067 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.553973 4994 scope.go:117] "RemoveContainer" containerID="169ddcb6a7b00ea999678151450f0db053d745d229a33e2dd89700bc08cf9670" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.608307 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.630612 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"curl\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " pod="service-telemetry/curl" Mar 10 00:35:13 
crc kubenswrapper[4994]: I0310 00:35:13.735175 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-j8jlg"] Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.739472 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.782719 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn" event={"ID":"71658d15-ee94-436a-8266-e6ef3680d0f0","Type":"ContainerStarted","Data":"9c04e4299610c8d0304f27bd7be11cb95958733395d98aa4af3fca9a79d6c6f7"} Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.793994 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6" event={"ID":"c50e95d0-ef8f-4355-ba85-156de14a4408","Type":"ContainerStarted","Data":"66ec1d9568ec573d582b0b8dbddb66e09c796d92044c36ab0c4f1ee9b447d358"} Mar 10 00:35:13 crc kubenswrapper[4994]: I0310 00:35:13.795901 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.193569 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.818897 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl" event={"ID":"ba63838d-f012-4322-afa9-d46cb2387ae8","Type":"ContainerStarted","Data":"20fe6fe24015dd6c61a3c23e21dfce6af7eff722f2267f212e0d75bd1acfc6e3"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.822623 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" 
event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerStarted","Data":"3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f"} Mar 10 00:35:14 crc kubenswrapper[4994]: I0310 00:35:14.826327 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} Mar 10 00:35:16 crc kubenswrapper[4994]: I0310 00:35:16.845699 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb" exitCode=0 Mar 10 00:35:16 crc kubenswrapper[4994]: I0310 00:35:16.845784 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} Mar 10 00:35:18 crc kubenswrapper[4994]: I0310 00:35:18.892812 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:35:18 crc kubenswrapper[4994]: I0310 00:35:18.893226 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.916578 4994 generic.go:334] "Generic (PLEG): container finished" podID="71067204-bbe4-4fcc-9fb0-6a349363479a" 
containerID="6d8fe1647417184e8e25c69d6debae961f810ea3d0dc03dd5ec0460f1f287477" exitCode=0 Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.916698 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerDied","Data":"6d8fe1647417184e8e25c69d6debae961f810ea3d0dc03dd5ec0460f1f287477"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.918948 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerStarted","Data":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.920843 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f"} Mar 10 00:35:25 crc kubenswrapper[4994]: I0310 00:35:25.960797 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzlg8" podStartSLOduration=4.437664607 podStartE2EDuration="16.960777846s" podCreationTimestamp="2026-03-10 00:35:09 +0000 UTC" firstStartedPulling="2026-03-10 00:35:12.756002209 +0000 UTC m=+1726.929708958" lastFinishedPulling="2026-03-10 00:35:25.279115408 +0000 UTC m=+1739.452822197" observedRunningTime="2026-03-10 00:35:25.954472944 +0000 UTC m=+1740.128179693" watchObservedRunningTime="2026-03-10 00:35:25.960777846 +0000 UTC m=+1740.134484615" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.176131 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.307703 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") pod \"71067204-bbe4-4fcc-9fb0-6a349363479a\" (UID: \"71067204-bbe4-4fcc-9fb0-6a349363479a\") " Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.312725 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_71067204-bbe4-4fcc-9fb0-6a349363479a/curl/0.log" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.336711 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7" (OuterVolumeSpecName: "kube-api-access-262z7") pod "71067204-bbe4-4fcc-9fb0-6a349363479a" (UID: "71067204-bbe4-4fcc-9fb0-6a349363479a"). InnerVolumeSpecName "kube-api-access-262z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.409403 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-262z7\" (UniqueName: \"kubernetes.io/projected/71067204-bbe4-4fcc-9fb0-6a349363479a-kube-api-access-262z7\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.575959 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933554 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"71067204-bbe4-4fcc-9fb0-6a349363479a","Type":"ContainerDied","Data":"3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f"} Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933590 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b29bec2ca18d264db1f11e8f6eb788920224123906afbb2881a52bc48dd332f" Mar 10 00:35:27 crc kubenswrapper[4994]: I0310 00:35:27.933649 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:35:29 crc kubenswrapper[4994]: I0310 00:35:29.513232 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:29 crc kubenswrapper[4994]: I0310 00:35:29.513502 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:35:30 crc kubenswrapper[4994]: I0310 00:35:30.551602 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=< Mar 10 00:35:30 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:35:30 crc kubenswrapper[4994]: > Mar 10 00:35:32 crc kubenswrapper[4994]: I0310 00:35:32.977241 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerStarted","Data":"17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8"} Mar 10 00:35:33 crc kubenswrapper[4994]: I0310 00:35:33.013809 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" podStartSLOduration=1.539929115 podStartE2EDuration="20.013782521s" podCreationTimestamp="2026-03-10 00:35:13 +0000 UTC" firstStartedPulling="2026-03-10 00:35:13.75400552 +0000 UTC m=+1727.927712269" lastFinishedPulling="2026-03-10 00:35:32.227858926 +0000 UTC m=+1746.401565675" observedRunningTime="2026-03-10 00:35:33.011100852 +0000 UTC m=+1747.184807621" watchObservedRunningTime="2026-03-10 00:35:33.013782521 +0000 UTC m=+1747.187489310" Mar 10 00:35:40 crc kubenswrapper[4994]: I0310 00:35:40.579277 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" 
podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=< Mar 10 00:35:40 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:35:40 crc kubenswrapper[4994]: > Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.893959 4994 patch_prober.go:28] interesting pod/machine-config-daemon-kfljj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.894672 4994 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.894747 4994 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.897505 4994 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"} pod="openshift-machine-config-operator/machine-config-daemon-kfljj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:35:48 crc kubenswrapper[4994]: I0310 00:35:48.897645 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" containerName="machine-config-daemon" 
containerID="cri-o://39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" gracePeriod=600 Mar 10 00:35:49 crc kubenswrapper[4994]: E0310 00:35:49.533386 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122011 4994 generic.go:334] "Generic (PLEG): container finished" podID="ced5d66d-39df-4267-b801-e1e60d517ace" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" exitCode=0 Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122047 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerDied","Data":"39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"} Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122075 4994 scope.go:117] "RemoveContainer" containerID="d8d935625d60ec1fe79acd428aa0c427cb2a184ba8e0a37f25ea8bb9485e5629" Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.122534 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:35:50 crc kubenswrapper[4994]: E0310 00:35:50.122741 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" 
podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:35:50 crc kubenswrapper[4994]: I0310 00:35:50.581277 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=< Mar 10 00:35:50 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:35:50 crc kubenswrapper[4994]: > Mar 10 00:35:57 crc kubenswrapper[4994]: I0310 00:35:57.703821 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.149064 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"] Mar 10 00:36:00 crc kubenswrapper[4994]: E0310 00:36:00.150334 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.150437 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.150655 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="71067204-bbe4-4fcc-9fb0-6a349363479a" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.151311 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.153565 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.153956 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.154191 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.185146 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"] Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.217391 4994 generic.go:334] "Generic (PLEG): container finished" podID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerID="d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f" exitCode=0 Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.217479 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f"} Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.218511 4994 scope.go:117] "RemoveContainer" containerID="d9ce53eb5bc559e2a48e954e260519cec3def3db573dfb9cad96b1f81692f09f" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.288783 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:00 crc 
kubenswrapper[4994]: I0310 00:36:00.391704 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.434206 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"auto-csr-approver-29551716-7pfgn\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.476171 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.569948 4994 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" probeResult="failure" output=< Mar 10 00:36:00 crc kubenswrapper[4994]: timeout: failed to connect service ":50051" within 1s Mar 10 00:36:00 crc kubenswrapper[4994]: > Mar 10 00:36:00 crc kubenswrapper[4994]: I0310 00:36:00.712028 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"] Mar 10 00:36:01 crc kubenswrapper[4994]: I0310 00:36:01.237750 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerStarted","Data":"286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb"} Mar 10 00:36:01 crc kubenswrapper[4994]: I0310 00:36:01.554264 4994 scope.go:117] 
"RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:01 crc kubenswrapper[4994]: E0310 00:36:01.554633 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:02 crc kubenswrapper[4994]: I0310 00:36:02.246626 4994 generic.go:334] "Generic (PLEG): container finished" podID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerID="c123e634ccea8da5ca365371ebfd215620629bd57b4fb38743d94a62179c0683" exitCode=0 Mar 10 00:36:02 crc kubenswrapper[4994]: I0310 00:36:02.246718 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerDied","Data":"c123e634ccea8da5ca365371ebfd215620629bd57b4fb38743d94a62179c0683"} Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.556955 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.652102 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") pod \"de54238c-3c16-4557-aaa0-fb321dc61ca7\" (UID: \"de54238c-3c16-4557-aaa0-fb321dc61ca7\") " Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.662672 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl" (OuterVolumeSpecName: "kube-api-access-9sqnl") pod "de54238c-3c16-4557-aaa0-fb321dc61ca7" (UID: "de54238c-3c16-4557-aaa0-fb321dc61ca7"). InnerVolumeSpecName "kube-api-access-9sqnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:36:03 crc kubenswrapper[4994]: I0310 00:36:03.754933 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sqnl\" (UniqueName: \"kubernetes.io/projected/de54238c-3c16-4557-aaa0-fb321dc61ca7-kube-api-access-9sqnl\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.267450 4994 generic.go:334] "Generic (PLEG): container finished" podID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerID="17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8" exitCode=0 Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.267579 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"17d0bcd92de3ad324c7f89758949a37e1f93ee189f044b9ef739243d90393ed8"} Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272206 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" 
event={"ID":"de54238c-3c16-4557-aaa0-fb321dc61ca7","Type":"ContainerDied","Data":"286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb"} Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272267 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="286423bcf633f2c2ee39746fd2fc31ea8582414925a75e045afa246fd899accb" Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.272267 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-7pfgn" Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.638414 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"] Mar 10 00:36:04 crc kubenswrapper[4994]: I0310 00:36:04.646106 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-vrwfw"] Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.577486 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682834 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682904 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682932 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682976 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.682994 4994 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.683087 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") pod \"6b5caccf-955c-4075-b043-7f1bea611f1e\" (UID: \"6b5caccf-955c-4075-b043-7f1bea611f1e\") " Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.704739 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.705530 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.705662 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7" (OuterVolumeSpecName: "kube-api-access-45wc7") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "kube-api-access-45wc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.710190 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.712189 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.721046 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.726964 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "6b5caccf-955c-4075-b043-7f1bea611f1e" (UID: "6b5caccf-955c-4075-b043-7f1bea611f1e"). InnerVolumeSpecName "ceilometer-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784421 4994 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784460 4994 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784471 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45wc7\" (UniqueName: \"kubernetes.io/projected/6b5caccf-955c-4075-b043-7f1bea611f1e-kube-api-access-45wc7\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784483 4994 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784495 4994 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784504 4994 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4994]: I0310 00:36:05.784512 4994 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/6b5caccf-955c-4075-b043-7f1bea611f1e-collectd-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289107 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" event={"ID":"6b5caccf-955c-4075-b043-7f1bea611f1e","Type":"ContainerDied","Data":"2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f"} Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289463 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d98b06a95f6259f9b2d744815dd5dc10490cb14f9c60bcb23cba61992b0701f" Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.289168 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-j8jlg" Mar 10 00:36:06 crc kubenswrapper[4994]: I0310 00:36:06.569024 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907fae93-d4a7-46e8-9fab-3c964fcb52ab" path="/var/lib/kubelet/pods/907fae93-d4a7-46e8-9fab-3c964fcb52ab/volumes" Mar 10 00:36:07 crc kubenswrapper[4994]: I0310 00:36:07.671687 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-j8jlg_6b5caccf-955c-4075-b043-7f1bea611f1e/smoketest-collectd/0.log" Mar 10 00:36:07 crc kubenswrapper[4994]: I0310 00:36:07.980100 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-j8jlg_6b5caccf-955c-4075-b043-7f1bea611f1e/smoketest-ceilometer/0.log" Mar 10 00:36:08 crc kubenswrapper[4994]: I0310 00:36:08.274754 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-2p86s_732a0e77-64b0-4fd5-a2b8-05cb2f80de80/default-interconnect/0.log" Mar 10 00:36:08 crc kubenswrapper[4994]: I0310 00:36:08.552147 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_ba63838d-f012-4322-afa9-d46cb2387ae8/bridge/2.log" Mar 10 00:36:08 crc 
kubenswrapper[4994]: I0310 00:36:08.782241 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-9r8gl_ba63838d-f012-4322-afa9-d46cb2387ae8/sg-core/0.log" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.013798 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-b85695595-kd89k_945b9546-8486-448b-a7b8-ec76634ff030/bridge/2.log" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.248775 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-b85695595-kd89k_945b9546-8486-448b-a7b8-ec76634ff030/sg-core/0.log" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.470123 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_71658d15-ee94-436a-8266-e6ef3680d0f0/bridge/2.log" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.564992 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.637244 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.706140 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-tj8mn_71658d15-ee94-436a-8266-e6ef3680d0f0/sg-core/0.log" Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.813636 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:36:09 crc kubenswrapper[4994]: I0310 00:36:09.936766 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_c50e95d0-ef8f-4355-ba85-156de14a4408/bridge/2.log" Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.166239 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-74cbf6749b-vvrn6_c50e95d0-ef8f-4355-ba85-156de14a4408/sg-core/0.log" Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.372995 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_eb6db775-6577-4ad6-90db-1fffd09b924b/bridge/2.log" Mar 10 00:36:10 crc kubenswrapper[4994]: I0310 00:36:10.597030 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-q9pkl_eb6db775-6577-4ad6-90db-1fffd09b924b/sg-core/0.log" Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.327534 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzlg8" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" containerID="cri-o://963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" gracePeriod=2 Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.746649 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.901960 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.902040 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.902104 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") pod \"6aff907a-203f-42b1-9ecb-20ab1860a00d\" (UID: \"6aff907a-203f-42b1-9ecb-20ab1860a00d\") " Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.905540 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities" (OuterVolumeSpecName: "utilities") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:36:11 crc kubenswrapper[4994]: I0310 00:36:11.911059 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6" (OuterVolumeSpecName: "kube-api-access-pvbb6") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "kube-api-access-pvbb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.003644 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.003907 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvbb6\" (UniqueName: \"kubernetes.io/projected/6aff907a-203f-42b1-9ecb-20ab1860a00d-kube-api-access-pvbb6\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.016290 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6aff907a-203f-42b1-9ecb-20ab1860a00d" (UID: "6aff907a-203f-42b1-9ecb-20ab1860a00d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.105644 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aff907a-203f-42b1-9ecb-20ab1860a00d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364111 4994 generic.go:334] "Generic (PLEG): container finished" podID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" exitCode=0 Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364158 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"} Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364191 4994 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-rzlg8" event={"ID":"6aff907a-203f-42b1-9ecb-20ab1860a00d","Type":"ContainerDied","Data":"067eb1ca7a214122994392b44bb2fc025416f971156c0d5b5d1b76db28176dcb"} Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364207 4994 scope.go:117] "RemoveContainer" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.364354 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzlg8" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.391516 4994 scope.go:117] "RemoveContainer" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.419929 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.424917 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzlg8"] Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.443453 4994 scope.go:117] "RemoveContainer" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459281 4994 scope.go:117] "RemoveContainer" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 00:36:12.459590 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": container with ID starting with 963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d not found: ID does not exist" containerID="963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459617 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d"} err="failed to get container status \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": rpc error: code = NotFound desc = could not find container \"963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d\": container with ID starting with 963495221c5cb348f7155c37e4b84f3728f00f9730bbcadda2444a488b22e85d not found: ID does not exist" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.459637 4994 scope.go:117] "RemoveContainer" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb" Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 00:36:12.460000 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": container with ID starting with 7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb not found: ID does not exist" containerID="7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460024 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb"} err="failed to get container status \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": rpc error: code = NotFound desc = could not find container \"7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb\": container with ID starting with 7e04057e9387ce3802b22ba067d767de074947f2550d5b30b96e68fcbca005eb not found: ID does not exist" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460038 4994 scope.go:117] "RemoveContainer" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548" Mar 10 00:36:12 crc kubenswrapper[4994]: E0310 
00:36:12.460269 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": container with ID starting with f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548 not found: ID does not exist" containerID="f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.460291 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548"} err="failed to get container status \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": rpc error: code = NotFound desc = could not find container \"f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548\": container with ID starting with f08ad1d10838015d73b156b9a4043c9abbe9e13c40366461ab1d6103a5e2d548 not found: ID does not exist" Mar 10 00:36:12 crc kubenswrapper[4994]: I0310 00:36:12.563428 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" path="/var/lib/kubelet/pods/6aff907a-203f-42b1-9ecb-20ab1860a00d/volumes" Mar 10 00:36:13 crc kubenswrapper[4994]: I0310 00:36:13.825353 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7457956966-kbwlx_0730e042-f632-4db2-a694-b5917982d77d/operator/0.log" Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.101942 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_69942722-a3c1-459b-96d3-260e0813093b/prometheus/0.log" Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.348370 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_4e5e8d5b-ba04-461d-b7d0-98d90dd79fd7/elasticsearch/0.log" Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 
00:36:14.554514 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:14 crc kubenswrapper[4994]: E0310 00:36:14.554862 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.630116 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-fzqv5_8e6cb6c2-b4dc-41ad-83dc-63de94ec3b6b/prometheus-webhook-snmp/0.log" Mar 10 00:36:14 crc kubenswrapper[4994]: I0310 00:36:14.871783 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_bd991a1f-d471-40c4-919f-75400e047b5d/alertmanager/0.log" Mar 10 00:36:27 crc kubenswrapper[4994]: I0310 00:36:27.555579 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:27 crc kubenswrapper[4994]: E0310 00:36:27.556842 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:31 crc kubenswrapper[4994]: I0310 00:36:31.103582 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-656df8f446-7rqn6_134b5ce4-37cf-459f-9c27-dafae8eb9e86/operator/0.log" Mar 10 00:36:32 crc kubenswrapper[4994]: I0310 00:36:32.641583 4994 scope.go:117] "RemoveContainer" containerID="dc93dce81f7d66a840274eb6f49e057db6ba425ffe2f1cac85085352655d2af7" Mar 10 00:36:34 crc kubenswrapper[4994]: I0310 00:36:34.456335 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7457956966-kbwlx_0730e042-f632-4db2-a694-b5917982d77d/operator/0.log" Mar 10 00:36:34 crc kubenswrapper[4994]: I0310 00:36:34.780546 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_f445e0d9-75b9-47ec-b98b-9e881b7f1856/qdr/0.log" Mar 10 00:36:39 crc kubenswrapper[4994]: I0310 00:36:39.553976 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:39 crc kubenswrapper[4994]: E0310 00:36:39.555225 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:50 crc kubenswrapper[4994]: I0310 00:36:50.554498 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:36:50 crc kubenswrapper[4994]: E0310 00:36:50.555181 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.274696 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275562 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275578 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275598 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-content" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275605 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-content" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275623 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-utilities" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275631 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="extract-utilities" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275642 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275649 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275662 4994 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275670 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: E0310 00:36:58.275689 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275697 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275833 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-ceilometer" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275851 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aff907a-203f-42b1-9ecb-20ab1860a00d" containerName="registry-server" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275863 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b5caccf-955c-4075-b043-7f1bea611f1e" containerName="smoketest-collectd" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.275891 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" containerName="oc" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.276694 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.281574 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxgzr"/"openshift-service-ca.crt" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.281796 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jxgzr"/"default-dockercfg-2vzbr" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.282422 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jxgzr"/"kube-root-ca.crt" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.289243 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.338767 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.338959 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.440341 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " 
pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.440747 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.441437 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.459477 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"must-gather-78m7n\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") " pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.594778 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n" Mar 10 00:36:58 crc kubenswrapper[4994]: I0310 00:36:58.829056 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:36:59 crc kubenswrapper[4994]: I0310 00:36:59.815412 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"256c5d91a415fa84cf93382c61caac00eddcb4f0bdb42042e78a5b5005818752"} Mar 10 00:37:02 crc kubenswrapper[4994]: I0310 00:37:02.553901 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:02 crc kubenswrapper[4994]: E0310 00:37:02.554475 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.862789 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"} Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.863529 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerStarted","Data":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"} Mar 10 00:37:05 crc kubenswrapper[4994]: I0310 00:37:05.887311 4994 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-jxgzr/must-gather-78m7n" podStartSLOduration=1.712485998 podStartE2EDuration="7.887278737s" podCreationTimestamp="2026-03-10 00:36:58 +0000 UTC" firstStartedPulling="2026-03-10 00:36:58.836305236 +0000 UTC m=+1833.010011985" lastFinishedPulling="2026-03-10 00:37:05.011097925 +0000 UTC m=+1839.184804724" observedRunningTime="2026-03-10 00:37:05.88160341 +0000 UTC m=+1840.055310159" watchObservedRunningTime="2026-03-10 00:37:05.887278737 +0000 UTC m=+1840.060985526" Mar 10 00:37:13 crc kubenswrapper[4994]: I0310 00:37:13.553566 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:13 crc kubenswrapper[4994]: E0310 00:37:13.554633 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:27 crc kubenswrapper[4994]: I0310 00:37:27.554670 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:27 crc kubenswrapper[4994]: E0310 00:37:27.555819 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.398199 4994 scope.go:117] "RemoveContainer" 
containerID="efddc60aba5bd5a67702715a9ba5bd81e253ee925326f34bfc3e8b98fe80390e" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.672762 4994 scope.go:117] "RemoveContainer" containerID="74761a9eff3f39fcf080fcf43da99f605fa6e3e4193c77e19940c640d07270e2" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.710314 4994 scope.go:117] "RemoveContainer" containerID="d932d5abafc702ac0d919613d1196b6f6540e0380b55b01cbf3b8f80be098bd1" Mar 10 00:37:33 crc kubenswrapper[4994]: I0310 00:37:33.752023 4994 scope.go:117] "RemoveContainer" containerID="7225b87c2068896f2fce33d1aefcc4a4a471fea15131e40f56a1db24cbb94f3e" Mar 10 00:37:40 crc kubenswrapper[4994]: I0310 00:37:40.554770 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:40 crc kubenswrapper[4994]: E0310 00:37:40.555557 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:55 crc kubenswrapper[4994]: I0310 00:37:55.553964 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:37:55 crc kubenswrapper[4994]: E0310 00:37:55.554867 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:37:57 crc 
kubenswrapper[4994]: I0310 00:37:57.225142 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-vjj5j_f15954a6-2036-4c32-a8b6-bc8e227d0fcd/control-plane-machine-set-operator/0.log" Mar 10 00:37:57 crc kubenswrapper[4994]: I0310 00:37:57.379327 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m6jnx_fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d/kube-rbac-proxy/0.log" Mar 10 00:37:57 crc kubenswrapper[4994]: I0310 00:37:57.379532 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m6jnx_fe6f59a5-bf2e-4926-b6f2-a18b4cb5479d/machine-api-operator/0.log" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.148389 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.150266 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.155424 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.155486 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.156320 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.158088 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.307804 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.409188 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.437454 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"auto-csr-approver-29551718-8lc8n\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " 
pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.483108 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.751167 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-8lc8n"] Mar 10 00:38:00 crc kubenswrapper[4994]: I0310 00:38:00.757590 4994 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:38:01 crc kubenswrapper[4994]: I0310 00:38:01.378249 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerStarted","Data":"3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969"} Mar 10 00:38:02 crc kubenswrapper[4994]: I0310 00:38:02.389730 4994 generic.go:334] "Generic (PLEG): container finished" podID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerID="75da607defaa0e2e04f032a83aef4a970ffb1092be73ab93d5a96db312ad3ddb" exitCode=0 Mar 10 00:38:02 crc kubenswrapper[4994]: I0310 00:38:02.389802 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerDied","Data":"75da607defaa0e2e04f032a83aef4a970ffb1092be73ab93d5a96db312ad3ddb"} Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.727938 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.863298 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") pod \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\" (UID: \"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe\") " Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.869106 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d" (OuterVolumeSpecName: "kube-api-access-rt67d") pod "b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" (UID: "b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe"). InnerVolumeSpecName "kube-api-access-rt67d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:38:03 crc kubenswrapper[4994]: I0310 00:38:03.964786 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt67d\" (UniqueName: \"kubernetes.io/projected/b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe-kube-api-access-rt67d\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408681 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" event={"ID":"b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe","Type":"ContainerDied","Data":"3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969"} Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408722 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9fa2cf82ce023be37c84782ce38836922cc621c1c4818b6b06a13f5e0d6969" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.408798 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-8lc8n" Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.816819 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:38:04 crc kubenswrapper[4994]: I0310 00:38:04.827690 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-nx9pb"] Mar 10 00:38:06 crc kubenswrapper[4994]: I0310 00:38:06.564235 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615394b2-0705-4358-853e-8c52eb448519" path="/var/lib/kubelet/pods/615394b2-0705-4358-853e-8c52eb448519/volumes" Mar 10 00:38:08 crc kubenswrapper[4994]: I0310 00:38:08.555368 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:08 crc kubenswrapper[4994]: E0310 00:38:08.557322 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.488801 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-jjkfq_943085e6-2580-48ae-9c2d-d83989c6204c/cert-manager-controller/0.log" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.641685 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-8qd55_c5f1e9a7-bff0-4565-9cef-d8904908dbfe/cert-manager-cainjector/0.log" Mar 10 00:38:11 crc kubenswrapper[4994]: I0310 00:38:11.674666 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-6qgfs_ef38e78a-b3a6-4de7-ba46-598693edf905/cert-manager-webhook/0.log" Mar 10 00:38:20 crc kubenswrapper[4994]: I0310 00:38:20.554408 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:20 crc kubenswrapper[4994]: E0310 00:38:20.555424 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.072691 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fnj29_13e52713-fbfe-43ba-ae51-b13a060d8a05/prometheus-operator/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.234956 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw_08b7eb36-ad76-4d9a-9fe9-f37febcdfdab/prometheus-operator-admission-webhook/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.250423 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_22d07ce7-cdcc-4804-8127-a4f3a9d1685f/prometheus-operator-admission-webhook/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.398623 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2jk2w_9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e/operator/0.log" Mar 10 00:38:27 crc kubenswrapper[4994]: I0310 00:38:27.409753 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gxbhj_65c6820f-4375-4de8-bcdf-0f0e2c4bcd87/perses-operator/0.log" Mar 10 00:38:32 crc kubenswrapper[4994]: I0310 00:38:32.554640 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:32 crc kubenswrapper[4994]: E0310 00:38:32.555659 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:33 crc kubenswrapper[4994]: I0310 00:38:33.858138 4994 scope.go:117] "RemoveContainer" containerID="ab12f6f7b139f927c15eec55fa9992338c6ae56c8336c6e012df890d87e1461b" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.731612 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.860323 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.891982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:42 crc kubenswrapper[4994]: I0310 00:38:42.907964 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.040531 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/extract/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.045690 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.089152 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fmpkkw_c792896d-13dd-4202-a2b7-62aac3396c78/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.217650 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.366036 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.371558 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.378033 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 
00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.574333 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.589604 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.601313 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39ek9dfk_b4b3e4dd-b86b-4442-9067-233a79e7942e/extract/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.718647 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.898522 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.929771 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:43 crc kubenswrapper[4994]: I0310 00:38:43.962787 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.110218 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.110723 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.130524 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52mgb5_1b75d3a9-a107-4c28-afc2-7eb7e1357113/extract/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.274645 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.481247 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.484601 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.486322 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.554055 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:44 crc kubenswrapper[4994]: E0310 00:38:44.554510 4994 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.656494 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/pull/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.659173 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/util/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.681282 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rxrbz_4dea22bc-f7b5-4722-b2c2-db96edfdcb96/extract/0.log" Mar 10 00:38:44 crc kubenswrapper[4994]: I0310 00:38:44.833313 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.023690 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.034147 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.041495 4994 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.205424 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.222567 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.406010 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.552998 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4vf56_2a4e3a5e-5559-4e0b-a9b5-f117c0dcf105/registry-server/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.589217 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.610538 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.644736 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.774983 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-utilities/0.log" Mar 10 00:38:45 crc kubenswrapper[4994]: I0310 00:38:45.781987 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.006649 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ppfwk_46c4619e-ab9f-4fd9-9f3e-5b7ba9415823/marketplace-operator/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.058762 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.076952 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-45nlb_2aaa4876-9545-4d43-b7a3-02d53c8ef8f5/registry-server/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.210269 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.232937 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.239982 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.364322 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-content/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.374547 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/extract-utilities/0.log" Mar 10 00:38:46 crc kubenswrapper[4994]: I0310 00:38:46.633337 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9dnvg_ad4ae94a-f55f-4133-9b34-f95992f5454b/registry-server/0.log" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.580759 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:47 crc kubenswrapper[4994]: E0310 00:38:47.581638 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.581663 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.581923 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f2e086-e5c2-41e7-9a8f-bcb37217ecbe" containerName="oc" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.584463 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.589098 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663121 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663231 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.663306 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765065 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765120 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765139 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765610 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.765635 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.785448 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"certified-operators-jq9wx\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:47 crc kubenswrapper[4994]: I0310 00:38:47.934293 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.347160 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.744685 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" exitCode=0 Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.744772 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774"} Mar 10 00:38:48 crc kubenswrapper[4994]: I0310 00:38:48.745008 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"36e610f9b1315284b8cc51fc123f9b0461ca200389e8e569cb119a259c7ea54d"} Mar 10 00:38:49 crc kubenswrapper[4994]: I0310 00:38:49.757439 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} Mar 10 00:38:50 crc kubenswrapper[4994]: I0310 00:38:50.772290 4994 generic.go:334] "Generic (PLEG): container finished" podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" exitCode=0 Mar 10 00:38:50 crc kubenswrapper[4994]: I0310 00:38:50.772330 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" 
event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} Mar 10 00:38:51 crc kubenswrapper[4994]: I0310 00:38:51.787336 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerStarted","Data":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} Mar 10 00:38:51 crc kubenswrapper[4994]: I0310 00:38:51.812139 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jq9wx" podStartSLOduration=2.33666101 podStartE2EDuration="4.812112257s" podCreationTimestamp="2026-03-10 00:38:47 +0000 UTC" firstStartedPulling="2026-03-10 00:38:48.746888172 +0000 UTC m=+1942.920594922" lastFinishedPulling="2026-03-10 00:38:51.22233939 +0000 UTC m=+1945.396046169" observedRunningTime="2026-03-10 00:38:51.810074313 +0000 UTC m=+1945.983781072" watchObservedRunningTime="2026-03-10 00:38:51.812112257 +0000 UTC m=+1945.985819046" Mar 10 00:38:57 crc kubenswrapper[4994]: I0310 00:38:57.554725 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:38:57 crc kubenswrapper[4994]: E0310 00:38:57.555725 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:38:57 crc kubenswrapper[4994]: I0310 00:38:57.935454 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:57 crc 
kubenswrapper[4994]: I0310 00:38:57.935525 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.011767 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.921380 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:38:58 crc kubenswrapper[4994]: I0310 00:38:58.983980 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.875515 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jq9wx" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server" containerID="cri-o://5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" gracePeriod=2 Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.895807 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-p2gkw_08b7eb36-ad76-4d9a-9fe9-f37febcdfdab/prometheus-operator-admission-webhook/0.log" Mar 10 00:39:00 crc kubenswrapper[4994]: I0310 00:39:00.912711 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fnj29_13e52713-fbfe-43ba-ae51-b13a060d8a05/prometheus-operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.060935 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2jk2w_9ec2ef1a-309f-4d22-b9e7-c6536fb8a46e/operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.098545 4994 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-gxbhj_65c6820f-4375-4de8-bcdf-0f0e2c4bcd87/perses-operator/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.126340 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f5545c74c-qhw5r_22d07ce7-cdcc-4804-8127-a4f3a9d1685f/prometheus-operator-admission-webhook/0.log" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.244993 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379256 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379303 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.379325 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") pod \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\" (UID: \"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c\") " Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.380097 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities" (OuterVolumeSpecName: "utilities") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: 
"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.384514 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf" (OuterVolumeSpecName: "kube-api-access-kqnlf") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "kube-api-access-kqnlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.430623 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" (UID: "4fdd1ac4-0308-464f-82e1-2c9ef11ea84c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.480998 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.481021 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.481030 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnlf\" (UniqueName: \"kubernetes.io/projected/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c-kube-api-access-kqnlf\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884296 4994 generic.go:334] "Generic (PLEG): container finished" 
podID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" exitCode=0 Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884338 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884367 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jq9wx" event={"ID":"4fdd1ac4-0308-464f-82e1-2c9ef11ea84c","Type":"ContainerDied","Data":"36e610f9b1315284b8cc51fc123f9b0461ca200389e8e569cb119a259c7ea54d"} Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884389 4994 scope.go:117] "RemoveContainer" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.884395 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jq9wx" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.912954 4994 scope.go:117] "RemoveContainer" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.920466 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.929422 4994 scope.go:117] "RemoveContainer" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.949583 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jq9wx"] Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.953784 4994 scope.go:117] "RemoveContainer" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.954231 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": container with ID starting with 5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e not found: ID does not exist" containerID="5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954285 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e"} err="failed to get container status \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": rpc error: code = NotFound desc = could not find container \"5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e\": container with ID starting with 5fc64e49114728811aaba12012eb52a8d99a84b604a145d1ec0f10b26216b76e not 
found: ID does not exist" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954319 4994 scope.go:117] "RemoveContainer" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.954741 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": container with ID starting with 24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62 not found: ID does not exist" containerID="24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954790 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62"} err="failed to get container status \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": rpc error: code = NotFound desc = could not find container \"24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62\": container with ID starting with 24b564fc0f4c379e4cf820e57c68bc3e36587dd30c0489d92c7bcd9056300a62 not found: ID does not exist" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.954825 4994 scope.go:117] "RemoveContainer" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: E0310 00:39:01.955155 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": container with ID starting with a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774 not found: ID does not exist" containerID="a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774" Mar 10 00:39:01 crc kubenswrapper[4994]: I0310 00:39:01.955185 4994 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774"} err="failed to get container status \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": rpc error: code = NotFound desc = could not find container \"a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774\": container with ID starting with a6e88913def75f591ab5d355dd8c543dbde3efb30fadcd7290d64f7316419774 not found: ID does not exist" Mar 10 00:39:02 crc kubenswrapper[4994]: I0310 00:39:02.562816 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" path="/var/lib/kubelet/pods/4fdd1ac4-0308-464f-82e1-2c9ef11ea84c/volumes" Mar 10 00:39:11 crc kubenswrapper[4994]: I0310 00:39:11.554481 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:39:11 crc kubenswrapper[4994]: E0310 00:39:11.557638 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:39:22 crc kubenswrapper[4994]: I0310 00:39:22.554308 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:39:22 crc kubenswrapper[4994]: E0310 00:39:22.555498 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:39:35 crc kubenswrapper[4994]: I0310 00:39:35.553748 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:39:35 crc kubenswrapper[4994]: E0310 00:39:35.554782 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.817273 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818266 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818291 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server" Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818332 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-content" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818344 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-content" Mar 10 00:39:41 crc kubenswrapper[4994]: E0310 00:39:41.818368 4994 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-utilities" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818382 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="extract-utilities" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.818631 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdd1ac4-0308-464f-82e1-2c9ef11ea84c" containerName="registry-server" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.820659 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:41 crc kubenswrapper[4994]: I0310 00:39:41.839861 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.005337 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.006521 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015546 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015608 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.015651 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.031103 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116712 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116804 4994 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.116936 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117023 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117646 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.117681 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.139444 4994 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"community-operators-q92kr\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.188335 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.218271 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.239446 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"infrawatch-operators-fh2tl\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.334073 4994 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.491401 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:42 crc kubenswrapper[4994]: W0310 00:39:42.799108 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2808be1e_48f1_4d08_98aa_c58ef6d4c153.slice/crio-fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852 WatchSource:0}: Error finding container fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852: Status 404 returned error can't find the container with id fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852 Mar 10 00:39:42 crc kubenswrapper[4994]: I0310 00:39:42.799735 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.282136 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerStarted","Data":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"} Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.282226 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerStarted","Data":"fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852"} Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285650 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb" exitCode=0 Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285699 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"} Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.285735 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerStarted","Data":"f7f1fc60078657da38db0b539efc33423a99b403512bcdfb6d438710e442e259"} Mar 10 00:39:43 crc kubenswrapper[4994]: I0310 00:39:43.310815 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-fh2tl" podStartSLOduration=2.180513493 podStartE2EDuration="2.31079108s" podCreationTimestamp="2026-03-10 00:39:41 +0000 UTC" firstStartedPulling="2026-03-10 00:39:42.80402288 +0000 UTC m=+1996.977729669" lastFinishedPulling="2026-03-10 00:39:42.934300477 +0000 UTC m=+1997.108007256" observedRunningTime="2026-03-10 00:39:43.3050637 +0000 UTC m=+1997.478770519" watchObservedRunningTime="2026-03-10 00:39:43.31079108 +0000 UTC m=+1997.484497859" Mar 10 00:39:45 crc kubenswrapper[4994]: I0310 00:39:45.319142 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119" exitCode=0 Mar 10 00:39:45 crc kubenswrapper[4994]: I0310 00:39:45.319218 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"} Mar 10 00:39:46 crc kubenswrapper[4994]: I0310 00:39:46.333819 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" 
event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerStarted","Data":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"} Mar 10 00:39:47 crc kubenswrapper[4994]: I0310 00:39:47.554866 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b" Mar 10 00:39:47 crc kubenswrapper[4994]: E0310 00:39:47.558021 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace" Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.359657 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7" exitCode=0 Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.359743 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jxgzr/must-gather-78m7n" event={"ID":"4c1f251c-2be2-460a-aa78-fca33bed879f","Type":"ContainerDied","Data":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"} Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.360481 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7" Mar 10 00:39:49 crc kubenswrapper[4994]: I0310 00:39:49.388584 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q92kr" podStartSLOduration=5.891254908 podStartE2EDuration="8.388558638s" podCreationTimestamp="2026-03-10 00:39:41 +0000 UTC" firstStartedPulling="2026-03-10 00:39:43.299527545 +0000 UTC m=+1997.473234324" lastFinishedPulling="2026-03-10 
00:39:45.796831265 +0000 UTC m=+1999.970538054" observedRunningTime="2026-03-10 00:39:46.377028942 +0000 UTC m=+2000.550735701" watchObservedRunningTime="2026-03-10 00:39:49.388558638 +0000 UTC m=+2003.562265427" Mar 10 00:39:50 crc kubenswrapper[4994]: I0310 00:39:50.294740 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/gather/0.log" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.188699 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.189100 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.260645 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.335489 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.335569 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.384163 4994 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.439723 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:52 crc kubenswrapper[4994]: I0310 00:39:52.470420 4994 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 
00:39:55.599900 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.600530 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q92kr" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server" containerID="cri-o://563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" gracePeriod=2 Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.803660 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:55 crc kubenswrapper[4994]: I0310 00:39:55.804305 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-fh2tl" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server" containerID="cri-o://b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" gracePeriod=2 Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.095084 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.231317 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.282949 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283109 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") pod \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\" (UID: \"2808be1e-48f1-4d08-98aa-c58ef6d4c153\") " Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283156 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.283188 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") pod \"ebb35288-8012-435c-acf9-93fa066af8fe\" (UID: \"ebb35288-8012-435c-acf9-93fa066af8fe\") " Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.284580 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities" (OuterVolumeSpecName: "utilities") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.289167 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q" (OuterVolumeSpecName: "kube-api-access-drw5q") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "kube-api-access-drw5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.290178 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb" (OuterVolumeSpecName: "kube-api-access-m48fb") pod "2808be1e-48f1-4d08-98aa-c58ef6d4c153" (UID: "2808be1e-48f1-4d08-98aa-c58ef6d4c153"). InnerVolumeSpecName "kube-api-access-m48fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.333576 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb35288-8012-435c-acf9-93fa066af8fe" (UID: "ebb35288-8012-435c-acf9-93fa066af8fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385296 4994 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385329 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48fb\" (UniqueName: \"kubernetes.io/projected/2808be1e-48f1-4d08-98aa-c58ef6d4c153-kube-api-access-m48fb\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385339 4994 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb35288-8012-435c-acf9-93fa066af8fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.385347 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drw5q\" (UniqueName: \"kubernetes.io/projected/ebb35288-8012-435c-acf9-93fa066af8fe-kube-api-access-drw5q\") on node \"crc\" DevicePath \"\"" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446376 4994 generic.go:334] "Generic (PLEG): container finished" podID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" exitCode=0 Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446584 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-fh2tl" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446618 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerDied","Data":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"} Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446670 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-fh2tl" event={"ID":"2808be1e-48f1-4d08-98aa-c58ef6d4c153","Type":"ContainerDied","Data":"fbf0824f915ac92385acafbf7095682500c296e3f241107075559fc7e1785852"} Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.446710 4994 scope.go:117] "RemoveContainer" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450678 4994 generic.go:334] "Generic (PLEG): container finished" podID="ebb35288-8012-435c-acf9-93fa066af8fe" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" exitCode=0 Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450768 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"} Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.450822 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q92kr" event={"ID":"ebb35288-8012-435c-acf9-93fa066af8fe","Type":"ContainerDied","Data":"f7f1fc60078657da38db0b539efc33423a99b403512bcdfb6d438710e442e259"} Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.451057 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q92kr" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.472620 4994 scope.go:117] "RemoveContainer" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.473108 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": container with ID starting with b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02 not found: ID does not exist" containerID="b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.473160 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02"} err="failed to get container status \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": rpc error: code = NotFound desc = could not find container \"b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02\": container with ID starting with b86e13b39537c4dea35501075b2835c640f84b064dc6d5ac23a274e34471df02 not found: ID does not exist" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.473192 4994 scope.go:117] "RemoveContainer" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.497448 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.507830 4994 scope.go:117] "RemoveContainer" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.510187 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["service-telemetry/infrawatch-operators-fh2tl"] Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.525472 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.532933 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q92kr"] Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.540438 4994 scope.go:117] "RemoveContainer" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.560660 4994 scope.go:117] "RemoveContainer" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.571171 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": container with ID starting with 563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46 not found: ID does not exist" containerID="563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.571440 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46"} err="failed to get container status \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": rpc error: code = NotFound desc = could not find container \"563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46\": container with ID starting with 563c57b1f3a8d6fb01588003012a574094259a0e8762798af0a467980d84ea46 not found: ID does not exist" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.571596 4994 scope.go:117] "RemoveContainer" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119" Mar 
10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.572295 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": container with ID starting with d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119 not found: ID does not exist" containerID="d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.572359 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119"} err="failed to get container status \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": rpc error: code = NotFound desc = could not find container \"d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119\": container with ID starting with d991116cc4c372c1ab5a71ced2ba2ff8a5cd6f5af6ef488e2342b8e8b1ae1119 not found: ID does not exist" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.572401 4994 scope.go:117] "RemoveContainer" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb" Mar 10 00:39:56 crc kubenswrapper[4994]: E0310 00:39:56.573348 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": container with ID starting with 7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb not found: ID does not exist" containerID="7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.573558 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb"} err="failed to get container status 
\"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": rpc error: code = NotFound desc = could not find container \"7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb\": container with ID starting with 7150a201b49c0a1050ea3c47c87f350a054a389c10a7c23838dc825d683009cb not found: ID does not exist" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.575072 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" path="/var/lib/kubelet/pods/2808be1e-48f1-4d08-98aa-c58ef6d4c153/volumes" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.575692 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" path="/var/lib/kubelet/pods/ebb35288-8012-435c-acf9-93fa066af8fe/volumes" Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.677558 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.677987 4994 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-jxgzr/must-gather-78m7n" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy" containerID="cri-o://b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7" gracePeriod=2 Mar 10 00:39:56 crc kubenswrapper[4994]: I0310 00:39:56.689773 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jxgzr/must-gather-78m7n"] Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.059163 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/copy/0.log" Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.060101 4994 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.197850 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") pod \"4c1f251c-2be2-460a-aa78-fca33bed879f\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") "
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.198023 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") pod \"4c1f251c-2be2-460a-aa78-fca33bed879f\" (UID: \"4c1f251c-2be2-460a-aa78-fca33bed879f\") "
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.203595 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7" (OuterVolumeSpecName: "kube-api-access-xdhc7") pod "4c1f251c-2be2-460a-aa78-fca33bed879f" (UID: "4c1f251c-2be2-460a-aa78-fca33bed879f"). InnerVolumeSpecName "kube-api-access-xdhc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.258076 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4c1f251c-2be2-460a-aa78-fca33bed879f" (UID: "4c1f251c-2be2-460a-aa78-fca33bed879f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.300849 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhc7\" (UniqueName: \"kubernetes.io/projected/4c1f251c-2be2-460a-aa78-fca33bed879f-kube-api-access-xdhc7\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.300926 4994 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c1f251c-2be2-460a-aa78-fca33bed879f-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.460610 4994 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jxgzr_must-gather-78m7n_4c1f251c-2be2-460a-aa78-fca33bed879f/copy/0.log"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.460998 4994 generic.go:334] "Generic (PLEG): container finished" podID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7" exitCode=143
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.461047 4994 scope.go:117] "RemoveContainer" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.461119 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jxgzr/must-gather-78m7n"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.519023 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578156 4994 scope.go:117] "RemoveContainer" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: E0310 00:39:57.578862 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": container with ID starting with b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7 not found: ID does not exist" containerID="b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578923 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7"} err="failed to get container status \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": rpc error: code = NotFound desc = could not find container \"b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7\": container with ID starting with b694a4af39eb4b4fd5da30ec8f15871e2f25d3396e9a71a2b17f43e716a1f1b7 not found: ID does not exist"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.578949 4994 scope.go:117] "RemoveContainer" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: E0310 00:39:57.579237 4994 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": container with ID starting with 54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7 not found: ID does not exist" containerID="54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"
Mar 10 00:39:57 crc kubenswrapper[4994]: I0310 00:39:57.579278 4994 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7"} err="failed to get container status \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": rpc error: code = NotFound desc = could not find container \"54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7\": container with ID starting with 54d3fcb8fac9afd2eb8b43dedf2ea24f9724fd19c32382386f722185948316a7 not found: ID does not exist"
Mar 10 00:39:58 crc kubenswrapper[4994]: I0310 00:39:58.576025 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" path="/var/lib/kubelet/pods/4c1f251c-2be2-460a-aa78-fca33bed879f/volumes"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138148 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"]
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.138957 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138972 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.138980 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-utilities"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.138987 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-utilities"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139002 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-content"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139009 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="extract-content"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139022 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139027 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139036 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139043 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.139058 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139064 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139166 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="gather"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139178 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="2808be1e-48f1-4d08-98aa-c58ef6d4c153" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139193 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1f251c-2be2-460a-aa78-fca33bed879f" containerName="copy"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139202 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb35288-8012-435c-acf9-93fa066af8fe" containerName="registry-server"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.139609 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.145792 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.146114 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.146132 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.153089 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"]
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.245750 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.347312 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.370605 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"auto-csr-approver-29551720-fld4g\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") " pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.501822 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.555434 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:40:00 crc kubenswrapper[4994]: E0310 00:40:00.555699 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:40:00 crc kubenswrapper[4994]: I0310 00:40:00.821650 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fld4g"]
Mar 10 00:40:00 crc kubenswrapper[4994]: W0310 00:40:00.822477 4994 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e99279_1a96_4b0e_b307_3d7badc31d87.slice/crio-c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4 WatchSource:0}: Error finding container c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4: Status 404 returned error can't find the container with id c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4
Mar 10 00:40:01 crc kubenswrapper[4994]: I0310 00:40:01.527354 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerStarted","Data":"c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4"}
Mar 10 00:40:02 crc kubenswrapper[4994]: I0310 00:40:02.534683 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerStarted","Data":"05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8"}
Mar 10 00:40:02 crc kubenswrapper[4994]: I0310 00:40:02.568693 4994 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551720-fld4g" podStartSLOduration=1.378823154 podStartE2EDuration="2.568677335s" podCreationTimestamp="2026-03-10 00:40:00 +0000 UTC" firstStartedPulling="2026-03-10 00:40:00.825006759 +0000 UTC m=+2014.998713518" lastFinishedPulling="2026-03-10 00:40:02.01486091 +0000 UTC m=+2016.188567699" observedRunningTime="2026-03-10 00:40:02.56545439 +0000 UTC m=+2016.739161139" watchObservedRunningTime="2026-03-10 00:40:02.568677335 +0000 UTC m=+2016.742384084"
Mar 10 00:40:03 crc kubenswrapper[4994]: I0310 00:40:03.552583 4994 generic.go:334] "Generic (PLEG): container finished" podID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerID="05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8" exitCode=0
Mar 10 00:40:03 crc kubenswrapper[4994]: I0310 00:40:03.552734 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerDied","Data":"05c0570be18d37a454ce767ab9e7e5834f1cf6173510437b4d6ef60917692ea8"}
Mar 10 00:40:04 crc kubenswrapper[4994]: I0310 00:40:04.965828 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.135442 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") pod \"e4e99279-1a96-4b0e-b307-3d7badc31d87\" (UID: \"e4e99279-1a96-4b0e-b307-3d7badc31d87\") "
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.143672 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq" (OuterVolumeSpecName: "kube-api-access-lpctq") pod "e4e99279-1a96-4b0e-b307-3d7badc31d87" (UID: "e4e99279-1a96-4b0e-b307-3d7badc31d87"). InnerVolumeSpecName "kube-api-access-lpctq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.237608 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpctq\" (UniqueName: \"kubernetes.io/projected/e4e99279-1a96-4b0e-b307-3d7badc31d87-kube-api-access-lpctq\") on node \"crc\" DevicePath \"\""
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577500 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fld4g" event={"ID":"e4e99279-1a96-4b0e-b307-3d7badc31d87","Type":"ContainerDied","Data":"c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4"}
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577557 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fld4g"
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.577565 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90e9a986f2f2e53cd8946728ec99ceea9179d1f394dae89d448677f285894a4"
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.631092 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"]
Mar 10 00:40:05 crc kubenswrapper[4994]: I0310 00:40:05.642414 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-s79ft"]
Mar 10 00:40:06 crc kubenswrapper[4994]: I0310 00:40:06.568540 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c" path="/var/lib/kubelet/pods/3a7e2a17-b4ac-46fe-a37a-f0f943d46d9c/volumes"
Mar 10 00:40:14 crc kubenswrapper[4994]: I0310 00:40:14.554757 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:40:14 crc kubenswrapper[4994]: E0310 00:40:14.555726 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:40:29 crc kubenswrapper[4994]: I0310 00:40:29.554376 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:40:29 crc kubenswrapper[4994]: E0310 00:40:29.555575 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:40:33 crc kubenswrapper[4994]: I0310 00:40:33.984813 4994 scope.go:117] "RemoveContainer" containerID="55ae1bc05b680756a0fab6fc454424e48677cf98e3af7624cd80e10e8ec94e10"
Mar 10 00:40:43 crc kubenswrapper[4994]: I0310 00:40:43.562995 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:40:43 crc kubenswrapper[4994]: E0310 00:40:43.564103 4994 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kfljj_openshift-machine-config-operator(ced5d66d-39df-4267-b801-e1e60d517ace)\"" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" podUID="ced5d66d-39df-4267-b801-e1e60d517ace"
Mar 10 00:40:58 crc kubenswrapper[4994]: I0310 00:40:58.554799 4994 scope.go:117] "RemoveContainer" containerID="39f6584f9f344e1728ceafdb3ff8a574a4e53d3f67d47cf4b47a6140383e852b"
Mar 10 00:40:59 crc kubenswrapper[4994]: I0310 00:40:59.095538 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kfljj" event={"ID":"ced5d66d-39df-4267-b801-e1e60d517ace","Type":"ContainerStarted","Data":"6c1ad9a8c7ff342b60b2a6e64cc726375287091375c2b73911e495c7acc74748"}
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.156959 4994 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"]
Mar 10 00:42:00 crc kubenswrapper[4994]: E0310 00:42:00.158372 4994 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.158397 4994 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.158657 4994 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e99279-1a96-4b0e-b307-3d7badc31d87" containerName="oc"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.159388 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.165855 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.166409 4994 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2f5rl"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.167818 4994 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.168777 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"]
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.249018 4994 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.351376 4994 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.386292 4994 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"auto-csr-approver-29551722-6kglk\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") " pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:00 crc kubenswrapper[4994]: I0310 00:42:00.495644 4994 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:01 crc kubenswrapper[4994]: I0310 00:42:01.066369 4994 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-6kglk"]
Mar 10 00:42:01 crc kubenswrapper[4994]: I0310 00:42:01.813531 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerStarted","Data":"7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a"}
Mar 10 00:42:02 crc kubenswrapper[4994]: I0310 00:42:02.828919 4994 generic.go:334] "Generic (PLEG): container finished" podID="adcf87b3-4e04-4030-84eb-f132b3d94687" containerID="36044df81f0ee5cf4a106de302e414c77b5e88caeb4ad173fa482133a0b5fa04" exitCode=0
Mar 10 00:42:02 crc kubenswrapper[4994]: I0310 00:42:02.829115 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerDied","Data":"36044df81f0ee5cf4a106de302e414c77b5e88caeb4ad173fa482133a0b5fa04"}
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.228335 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.420689 4994 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") pod \"adcf87b3-4e04-4030-84eb-f132b3d94687\" (UID: \"adcf87b3-4e04-4030-84eb-f132b3d94687\") "
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.431352 4994 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc" (OuterVolumeSpecName: "kube-api-access-sv8vc") pod "adcf87b3-4e04-4030-84eb-f132b3d94687" (UID: "adcf87b3-4e04-4030-84eb-f132b3d94687"). InnerVolumeSpecName "kube-api-access-sv8vc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.522800 4994 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv8vc\" (UniqueName: \"kubernetes.io/projected/adcf87b3-4e04-4030-84eb-f132b3d94687-kube-api-access-sv8vc\") on node \"crc\" DevicePath \"\""
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854597 4994 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-6kglk" event={"ID":"adcf87b3-4e04-4030-84eb-f132b3d94687","Type":"ContainerDied","Data":"7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a"}
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854658 4994 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e436ade524ad6984f2d2252a39f09227dcc0b17383ba118b30f11a4a0b1373a"
Mar 10 00:42:04 crc kubenswrapper[4994]: I0310 00:42:04.854678 4994 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-6kglk"
Mar 10 00:42:05 crc kubenswrapper[4994]: I0310 00:42:05.315773 4994 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"]
Mar 10 00:42:05 crc kubenswrapper[4994]: I0310 00:42:05.326014 4994 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-7pfgn"]
Mar 10 00:42:06 crc kubenswrapper[4994]: I0310 00:42:06.569698 4994 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de54238c-3c16-4557-aaa0-fb321dc61ca7" path="/var/lib/kubelet/pods/de54238c-3c16-4557-aaa0-fb321dc61ca7/volumes"